46400 1727204509.24921: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
46400 1727204509.25419: Added group all to inventory
46400 1727204509.25422: Added group ungrouped to inventory
46400 1727204509.25426: Group all now contains ungrouped
46400 1727204509.25430: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
46400 1727204509.52033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
46400 1727204509.52098: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
46400 1727204509.52121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
46400 1727204509.52184: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
46400 1727204509.52257: Loaded config def from plugin (inventory/script)
46400 1727204509.52260: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
46400 1727204509.52303: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
46400 1727204509.53009: Loaded config def from plugin (inventory/yaml)
46400 1727204509.53011: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
46400 1727204509.53192: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
46400 1727204509.53747: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
46400 1727204509.53751: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
46400 1727204509.53754: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
46400 1727204509.53762: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
46400 1727204509.53769: Loading data from /tmp/network-M6W/inventory-5vW.yml
46400 1727204509.53848: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
46400 1727204509.53918: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
46400 1727204509.53970: Loading data from /tmp/network-M6W/inventory-5vW.yml
46400 1727204509.54059: group all already in inventory
46400 1727204509.54074: set inventory_file for managed-node1
46400 1727204509.54079: set inventory_dir for managed-node1
46400 1727204509.54080: Added host managed-node1 to inventory
46400 1727204509.54082: Added host managed-node1 to group all
46400 1727204509.54083: set ansible_host for managed-node1
46400 1727204509.54084: set ansible_ssh_extra_args for managed-node1
46400 1727204509.54088: set inventory_file for managed-node2
46400 1727204509.54090: set inventory_dir for managed-node2
46400 1727204509.54091: Added host managed-node2 to inventory
46400 1727204509.54093: Added host managed-node2 to group all
46400 1727204509.54094: set ansible_host for managed-node2
46400 1727204509.54094: set ansible_ssh_extra_args for managed-node2
46400 1727204509.54097: set inventory_file for managed-node3
46400 1727204509.54099: set inventory_dir for managed-node3
46400 1727204509.54100: Added host managed-node3 to inventory
46400 1727204509.54101: Added host managed-node3 to group all
46400 1727204509.54102: set ansible_host for managed-node3
46400 1727204509.54103: set ansible_ssh_extra_args for managed-node3
46400 1727204509.54105: Reconcile groups and hosts in inventory.
46400 1727204509.54109: Group ungrouped now contains managed-node1
46400 1727204509.54111: Group ungrouped now contains managed-node2
46400 1727204509.54113: Group ungrouped now contains managed-node3
46400 1727204509.54201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
46400 1727204509.54334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
46400 1727204509.54391: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
46400 1727204509.54424: Loaded config def from plugin (vars/host_group_vars)
46400 1727204509.54426: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
46400 1727204509.54433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
46400 1727204509.54442: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
46400 1727204509.54494: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
46400 1727204509.54878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
46400 1727204509.55296: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
46400 1727204509.55335: Loaded config def from plugin (connection/local)
46400 1727204509.55338: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
46400 1727204509.56029: Loaded config def from plugin (connection/paramiko_ssh)
46400 1727204509.56032: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
46400 1727204509.57070: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
46400 1727204509.57116: Loaded config def from plugin (connection/psrp)
46400 1727204509.57150: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
46400 1727204509.58059: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
46400 1727204509.58110: Loaded config def from plugin (connection/ssh)
46400 1727204509.58114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
46400 1727204509.58495: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
46400 1727204509.58540: Loaded config def from plugin (connection/winrm)
46400 1727204509.58543: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
46400 1727204509.58582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
46400 1727204509.58651: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
46400 1727204509.58722: Loaded config def from plugin (shell/cmd)
46400 1727204509.58724: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
46400 1727204509.58757: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
46400 1727204509.58827: Loaded config def from plugin (shell/powershell)
46400 1727204509.58829: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
46400 1727204509.58894: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
46400 1727204509.59089: Loaded config def from plugin (shell/sh)
46400 1727204509.59092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
46400 1727204509.59126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
46400 1727204509.59316: Loaded config def from plugin (become/runas)
46400 1727204509.59318: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
46400 1727204509.59946: Loaded config def from plugin (become/su)
46400 1727204509.59948: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
46400 1727204509.60130: Loaded config def from plugin (become/sudo)
46400 1727204509.60133: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
46400 1727204509.60174: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
46400 1727204509.60591: in VariableManager get_vars()
46400 1727204509.60614: done with get_vars()
46400 1727204509.60757: trying /usr/local/lib/python3.12/site-packages/ansible/modules
46400 1727204509.65944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
46400 1727204509.66103: in VariableManager get_vars()
46400 1727204509.66113: done with get_vars()
46400 1727204509.66116: variable 'playbook_dir' from source: magic vars
46400 1727204509.66117: variable 'ansible_playbook_python' from source: magic vars
46400 1727204509.66118: variable 'ansible_config_file' from source: magic vars
46400 1727204509.66118: variable 'groups' from source: magic vars
46400 1727204509.66120: variable 'omit' from source: magic vars
46400 1727204509.66120: variable 'ansible_version' from source: magic vars
46400 1727204509.66121: variable 'ansible_check_mode' from source: magic vars
46400 1727204509.66122: variable 'ansible_diff_mode' from source: magic vars
46400 1727204509.66123: variable 'ansible_forks' from source: magic vars
46400 1727204509.66123: variable 'ansible_inventory_sources' from source: magic vars
46400 1727204509.66124: variable 'ansible_skip_tags' from source: magic vars
46400 1727204509.66125: variable 'ansible_limit' from source: magic vars
46400 1727204509.66129: variable 'ansible_run_tags' from source: magic vars
46400 1727204509.66130: variable 'ansible_verbosity' from source: magic vars
46400 1727204509.66169: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml
46400 1727204509.67516: in VariableManager get_vars()
46400 1727204509.67534: done with get_vars()
46400 1727204509.67705: in VariableManager get_vars()
46400 1727204509.67722: done with get_vars()
46400 1727204509.67777: in VariableManager get_vars()
46400 1727204509.67909: done with get_vars()
46400 1727204509.67961: in VariableManager get_vars()
46400 1727204509.67976: done with get_vars()
46400 1727204509.68117: in VariableManager get_vars()
46400 1727204509.68197: done with get_vars()
46400 1727204509.68548: in VariableManager get_vars()
46400 1727204509.68570: done with get_vars()
46400 1727204509.68632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
46400 1727204509.68653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
46400 1727204509.69040: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
46400 1727204509.69423: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
46400 1727204509.69426: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
46400 1727204509.69468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
46400 1727204509.69493: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
46400 1727204509.69658: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
46400 1727204509.69722: Loaded config def from plugin (callback/default)
46400 1727204509.69724: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
46400 1727204509.71243: Loaded config def from plugin (callback/junit)
46400 1727204509.71246: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
46400 1727204509.71304: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
46400 1727204509.71386: Loaded config def from plugin (callback/minimal)
46400 1727204509.71388: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
46400 1727204509.71427: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
46400 1727204509.71496: Loaded config def from plugin (callback/tree)
46400 1727204509.71499: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
46400 1727204509.71639: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
46400 1727204509.71642: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
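The inventory parse above shows the yaml inventory plugin reading /tmp/network-M6W/inventory-5vW.yml and defining managed-node1, managed-node2 and managed-node3 in group all (and therefore ungrouped), with ansible_host and ansible_ssh_extra_args set per host. A minimal inventory sketch consistent with those log entries follows; the real addresses and SSH arguments are not shown in this log, so the values below are placeholders only.

```yaml
# Hypothetical reconstruction of inventory-5vW.yml for illustration; all values are placeholders.
all:
  hosts:
    managed-node1:
      ansible_host: 203.0.113.11                              # placeholder address
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder SSH options
    managed-node2:
      ansible_host: 203.0.113.12                              # placeholder address
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder SSH options
    managed-node3:
      ansible_host: 203.0.113.13                              # placeholder address
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder SSH options
```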
PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
46400 1727204509.71679: in VariableManager get_vars()
46400 1727204509.71701: done with get_vars()
46400 1727204509.71707: in VariableManager get_vars()
46400 1727204509.71715: done with get_vars()
46400 1727204509.71719: variable 'omit' from source: magic vars
46400 1727204509.71758: in VariableManager get_vars()
46400 1727204509.71776: done with get_vars()
46400 1727204509.71804: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
46400 1727204509.72521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
46400 1727204509.72730: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
46400 1727204509.73365: getting the remaining hosts for this loop
46400 1727204509.73367: done getting the remaining hosts for this loop
46400 1727204509.73370: getting the next task for host managed-node2
46400 1727204509.73374: done getting next task for host managed-node2
46400 1727204509.73376: ^ task is: TASK: Gathering Facts
46400 1727204509.73377: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
46400 1727204509.73379: getting variables
46400 1727204509.73380: in VariableManager get_vars()
46400 1727204509.73390: Calling all_inventory to load vars for managed-node2
46400 1727204509.73392: Calling groups_inventory to load vars for managed-node2
46400 1727204509.73395: Calling all_plugins_inventory to load vars for managed-node2
46400 1727204509.73406: Calling all_plugins_play to load vars for managed-node2
46400 1727204509.73418: Calling groups_plugins_inventory to load vars for managed-node2
46400 1727204509.73421: Calling groups_plugins_play to load vars for managed-node2
46400 1727204509.73462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
46400 1727204509.73518: done with get_vars()
46400 1727204509.73525: done getting variables
46400 1727204509.73602: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Tuesday 24 September 2024 15:01:49 -0400 (0:00:00.020) 0:00:00.020 *****
46400 1727204509.73624: entering _queue_task() for managed-node2/gather_facts
46400 1727204509.73626: Creating lock for gather_facts
46400 1727204509.74159: worker is 1 (out of 1 available)
46400 1727204509.74174: exiting _queue_task() for managed-node2/gather_facts
46400 1727204509.74189: done queuing things up, now waiting for results queue to drain
46400 1727204509.74191: waiting for pending results...
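The PLAYBOOK banner above reports two plays in tests_states_nm.yml: the play named in the PLAY banner, plus a second one that pulls in playbooks/tests_states.yml (which the log loads immediately after the test file). A rough sketch of how such a wrapper could be laid out is below; the provider task is only assumed from the play name, and the actual file in fedora.linux_system_roles may differ in detail.

```yaml
# Hypothetical shape of tests_states_nm.yml, inferred solely from the banners in this log.
- name: Run playbook 'playbooks/tests_states.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to NetworkManager   # assumed from the play name
      set_fact:
        network_provider: nm

- import_playbook: playbooks/tests_states.yml
```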
46400 1727204509.74428: running TaskExecutor() for managed-node2/TASK: Gathering Facts 46400 1727204509.74523: in run() - task 0affcd87-79f5-1303-fda8-00000000001b 46400 1727204509.74549: variable 'ansible_search_path' from source: unknown 46400 1727204509.74593: calling self._execute() 46400 1727204509.74671: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204509.74683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204509.74695: variable 'omit' from source: magic vars 46400 1727204509.74798: variable 'omit' from source: magic vars 46400 1727204509.74828: variable 'omit' from source: magic vars 46400 1727204509.74876: variable 'omit' from source: magic vars 46400 1727204509.74923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204509.74968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204509.74999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204509.75020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204509.75034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204509.75071: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204509.75080: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204509.75097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204509.75202: Set connection var ansible_shell_type to sh 46400 1727204509.75217: Set connection var ansible_shell_executable to /bin/sh 46400 1727204509.75226: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204509.75234: Set connection var ansible_connection to ssh 46400 1727204509.75244: Set connection var ansible_pipelining to False 46400 1727204509.75252: Set connection var ansible_timeout to 10 46400 1727204509.75284: variable 'ansible_shell_executable' from source: unknown 46400 1727204509.75291: variable 'ansible_connection' from source: unknown 46400 1727204509.75297: variable 'ansible_module_compression' from source: unknown 46400 1727204509.75309: variable 'ansible_shell_type' from source: unknown 46400 1727204509.75316: variable 'ansible_shell_executable' from source: unknown 46400 1727204509.75321: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204509.75328: variable 'ansible_pipelining' from source: unknown 46400 1727204509.75334: variable 'ansible_timeout' from source: unknown 46400 1727204509.75341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204509.75538: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 46400 1727204509.75552: variable 'omit' from source: magic vars 46400 1727204509.75563: starting attempt loop 46400 1727204509.75572: running the handler 46400 1727204509.75590: variable 'ansible_facts' from source: unknown 46400 1727204509.75611: _low_level_execute_command(): starting 46400 1727204509.75622: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204509.76417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 
1727204509.76431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204509.76446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204509.76468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204509.76516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204509.76529: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204509.76541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204509.76563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204509.76577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204509.76587: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204509.76599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204509.76611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204509.76632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204509.76643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204509.76656: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204509.76675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204509.76757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204509.76792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204509.76811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204509.76890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204509.78545: stdout chunk (state=3): >>>/root <<< 46400 1727204509.78740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204509.78743: stdout chunk (state=3): >>><<< 46400 1727204509.78746: stderr chunk (state=3): >>><<< 46400 1727204509.78869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 46400 1727204509.78872: _low_level_execute_command(): starting 46400 1727204509.78875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373 `" && echo ansible-tmp-1727204509.787712-46487-192195880851373="` echo /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373 `" ) && sleep 0' 46400 1727204509.80133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204509.80137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204509.80183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204509.80187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204509.80189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204509.80238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204509.80986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204509.80990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204509.81042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204509.82911: stdout chunk (state=3): >>>ansible-tmp-1727204509.787712-46487-192195880851373=/root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373 <<< 46400 1727204509.83032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204509.83112: stderr chunk (state=3): >>><<< 46400 1727204509.83116: stdout chunk (state=3): >>><<< 46400 1727204509.83373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204509.787712-46487-192195880851373=/root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204509.83377: variable 'ansible_module_compression' from source: unknown 46400 1727204509.83379: ANSIBALLZ: Using generic lock for ansible.legacy.setup 46400 1727204509.83381: ANSIBALLZ: Acquiring lock 46400 1727204509.83383: ANSIBALLZ: Lock acquired: 140519374124768 46400 1727204509.83385: ANSIBALLZ: Creating module 46400 1727204510.50559: ANSIBALLZ: Writing module into payload 46400 1727204510.51768: ANSIBALLZ: Writing module 46400 1727204510.51850: ANSIBALLZ: Renaming module 46400 1727204510.52043: ANSIBALLZ: Done creating module 46400 1727204510.52091: variable 'ansible_facts' from source: unknown 46400 1727204510.52149: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204510.52167: _low_level_execute_command(): starting 46400 1727204510.52178: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 46400 1727204510.53974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.53979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.54007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.54011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.54013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.54250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204510.54254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204510.54256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204510.54407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204510.56074: stdout chunk (state=3): >>>PLATFORM <<< 46400 1727204510.56158: stdout chunk (state=3): >>>Linux <<< 46400 1727204510.56165: stdout chunk (state=3): >>>FOUND <<< 46400 1727204510.56169: stdout chunk (state=3): >>>/usr/bin/python3.9 <<< 46400 1727204510.56183: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 46400 1727204510.56318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204510.56406: stderr chunk (state=3): >>><<< 46400 1727204510.56409: stdout chunk (state=3): >>><<< 46400 
1727204510.56531: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204510.56537 [managed-node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 46400 1727204510.56540: _low_level_execute_command(): starting 46400 1727204510.56542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 46400 1727204510.56948: Sending initial data 46400 1727204510.56951: Sent initial data (1181 bytes) 46400 1727204510.57532: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.57539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.57570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204510.57583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.57585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.57904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204510.57908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204510.57913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204510.57979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 46400 1727204510.62629: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 46400 1727204510.63178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204510.63270: stderr chunk (state=3): >>><<< 46400 1727204510.63274: stdout chunk (state=3): >>><<< 46400 1727204510.63276: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 46400 1727204510.63487: variable 'ansible_facts' from source: unknown 46400 1727204510.63490: variable 'ansible_facts' from source: unknown 46400 1727204510.63492: variable 'ansible_module_compression' from source: unknown 46400 1727204510.63495: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 46400 1727204510.63497: variable 'ansible_facts' from source: unknown 46400 1727204510.64403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/AnsiballZ_setup.py 46400 1727204510.65096: Sending initial data 46400 1727204510.65099: Sent initial data (153 bytes) 46400 1727204510.67195: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204510.67385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.67406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.67425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.67470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204510.67483: stderr chunk (state=3): >>>debug2: match not found <<< 46400 
1727204510.67498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.67517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204510.67529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204510.67540: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204510.67553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.67569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.67585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.67598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204510.67607: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204510.67621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.67688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204510.67703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204510.67716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204510.67949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204510.69549: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204510.69591: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204510.69631: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpyn8_p6ft /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/AnsiballZ_setup.py <<< 46400 1727204510.69674: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204510.72587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204510.72770: stderr chunk (state=3): >>><<< 46400 1727204510.72773: stdout chunk (state=3): >>><<< 46400 1727204510.72781: done transferring module to remote 46400 1727204510.72784: _low_level_execute_command(): starting 46400 1727204510.72786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/ /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/AnsiballZ_setup.py && sleep 0' 46400 1727204510.74653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.74657: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.74692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.74696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.74699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.74815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204510.74818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.74883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204510.74997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204510.75001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204510.75058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204510.77368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204510.77450: stderr chunk (state=3): >>><<< 46400 1727204510.77454: stdout chunk (state=3): >>><<< 46400 1727204510.77524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204510.77528: _low_level_execute_command(): starting 46400 1727204510.77530: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/AnsiballZ_setup.py && sleep 0' 46400 1727204510.79410: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204510.79426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.79442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.79461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 
1727204510.79510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204510.79608: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204510.79622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.79641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204510.79654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204510.79667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204510.79680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204510.79695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204510.79714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204510.79725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204510.79734: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204510.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204510.79885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204510.79908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204510.79930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204510.80018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204510.81959: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 46400 1727204510.81963: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 46400 1727204510.82027: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 46400 1727204510.82072: stdout chunk (state=3): >>>import 'posix' # <<< 46400 1727204510.82107: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 46400 1727204510.82111: stdout chunk (state=3): >>># installing zipimport hook <<< 46400 1727204510.82139: stdout chunk (state=3): >>>import 'time' # <<< 46400 1727204510.82151: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 46400 1727204510.82203: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.82223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 46400 1727204510.82246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 46400 1727204510.82249: stdout chunk (state=3): >>>import '_codecs' # <<< 46400 1727204510.82279: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8dc0> <<< 46400 1727204510.82313: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 46400 1727204510.82341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' 
import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d3a0> <<< 46400 1727204510.82344: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8b20> <<< 46400 1727204510.82368: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 46400 1727204510.82381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8ac0> <<< 46400 1727204510.82407: stdout chunk (state=3): >>>import '_signal' # <<< 46400 1727204510.82429: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 46400 1727204510.82446: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d490> <<< 46400 1727204510.82471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 46400 1727204510.82510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 46400 1727204510.82513: stdout chunk (state=3): >>>import '_abc' # <<< 46400 1727204510.82524: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d940> <<< 46400 1727204510.82547: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d670> <<< 46400 1727204510.82590: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 46400 1727204510.82593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 46400 1727204510.82607: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 46400 1727204510.82633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 46400 1727204510.82648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 46400 1727204510.82673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 46400 1727204510.82698: stdout chunk (state=3): >>>import '_stat' # <<< 46400 1727204510.82701: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834190> <<< 46400 1727204510.82715: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 46400 1727204510.82740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 46400 1727204510.82816: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834220> <<< 46400 1727204510.82843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches 
/usr/lib64/python3.9/posixpath.py <<< 46400 1727204510.82846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 46400 1727204510.82881: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 46400 1727204510.82886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963857850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834940> <<< 46400 1727204510.82919: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963895880> <<< 46400 1727204510.82946: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 46400 1727204510.82950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 46400 1727204510.82952: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96382dd90> <<< 46400 1727204510.83016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 46400 1727204510.83019: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963857d90> <<< 46400 1727204510.83076: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d970> <<< 46400 1727204510.83103: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 46400 1727204510.83435: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 46400 1727204510.83455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 46400 1727204510.83476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 46400 1727204510.83488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 46400 1727204510.83502: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 46400 1727204510.83519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 46400 1727204510.83538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 46400 1727204510.83559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 46400 1727204510.83562: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d4f10> <<< 46400 1727204510.83615: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d90a0> <<< 46400 1727204510.83628: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 46400 1727204510.83644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 46400 1727204510.83658: stdout chunk (state=3): >>>import '_sre' # <<< 46400 1727204510.83685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 46400 1727204510.83696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 46400 1727204510.83721: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 46400 1727204510.83749: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635cc5b0> <<< 46400 1727204510.83768: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d36a0> <<< 46400 1727204510.83785: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d43d0> <<< 46400 1727204510.83797: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 46400 1727204510.83871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 46400 1727204510.83896: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 46400 1727204510.83923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.83943: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 46400 1727204510.83955: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 46400 1727204510.83993: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.83997: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9634bae20> <<< 46400 1727204510.83999: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634ba910> <<< 46400 1727204510.84011: stdout chunk (state=3): >>>import 'itertools' # <<< 46400 1727204510.84030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634baf10> <<< 46400 1727204510.84055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 46400 1727204510.84070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 46400 1727204510.84087: stdout chunk (state=3): >>>import '_operator' # <<< 46400 1727204510.84100: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634bafd0> <<< 46400 1727204510.84124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 46400 1727204510.84139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cb0d0> <<< 46400 1727204510.84150: stdout chunk (state=3): >>>import '_collections' # <<< 46400 1727204510.84196: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635aed90> <<< 46400 1727204510.84210: stdout chunk (state=3): >>>import '_functools' # <<< 46400 1727204510.84234: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635a7670> <<< 46400 1727204510.84287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635ba6d0> <<< 46400 1727204510.84299: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635dae80> <<< 46400 1727204510.84320: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 46400 1727204510.84352: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.84367: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9634cbcd0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff9635ae2b0> <<< 46400 1727204510.84391: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.84409: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9635ba2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635e0a30> <<< 46400 1727204510.84442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 46400 1727204510.84454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 46400 1727204510.84470: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 46400 1727204510.84481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.84493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 46400 1727204510.84508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 46400 1727204510.84522: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbeb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbdf0> <<< 46400 1727204510.84577: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbd60> <<< 46400 1727204510.84592: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 46400 1727204510.84603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 46400 1727204510.84662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 46400 1727204510.84701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 46400 1727204510.84775: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96349e3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 46400 1727204510.84794: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff96349e4c0> <<< 46400 1727204510.84917: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634d3f40> <<< 46400 1727204510.84953: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cda90> <<< 46400 1727204510.84979: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cd490> <<< 46400 1727204510.84996: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 46400 1727204510.85007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 46400 1727204510.85033: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 46400 1727204510.85052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 46400 1727204510.85071: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 46400 1727204510.85084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 46400 1727204510.85098: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633d2220> <<< 46400 1727204510.85124: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963489520> <<< 46400 1727204510.85179: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cdf10> <<< 46400 1727204510.85191: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635e00a0> <<< 46400 1727204510.85206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 46400 1727204510.85224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 46400 1727204510.85240: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 46400 1727204510.85252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633e4b50> <<< 46400 1727204510.85274: stdout chunk (state=3): >>>import 'errno' # <<< 46400 1727204510.85308: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.85323: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633e4e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 46400 1727204510.85337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 46400 1727204510.85357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 46400 1727204510.85371: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 46400 1727204510.85382: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5790> <<< 46400 1727204510.85401: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 46400 1727204510.85432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 46400 1727204510.85461: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5cd0> <<< 46400 1727204510.85497: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.85509: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96338e400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633e4f70> <<< 46400 1727204510.85533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 46400 1727204510.85584: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96339f2e0> <<< 46400 1727204510.85599: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5610> <<< 46400 1727204510.85611: stdout chunk (state=3): >>>import 'pwd' # <<< 46400 1727204510.85642: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96339f3a0> <<< 46400 1727204510.85750: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cba30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 46400 1727204510.85811: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 46400 1727204510.85861: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' 
executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bb7c0> <<< 46400 1727204510.85893: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb8b0> <<< 46400 1727204510.85910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 46400 1727204510.86107: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bbd00> <<< 46400 1727204510.86188: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633c6250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bb940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633aea90> <<< 46400 1727204510.86202: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cb610> <<< 46400 1727204510.86297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 46400 1727204510.86313: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bbaf0> <<< 46400 1727204510.86453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 46400 1727204510.86471: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff9632de6d0> <<< 46400 1727204510.86733: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 46400 1727204510.86834: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.86859: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 46400 1727204510.86876: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.86889: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.86903: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 46400 1727204510.86917: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 46400 1727204510.88140: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.89070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c820> <<< 46400 1727204510.89093: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.89122: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 46400 1727204510.89148: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 46400 1727204510.89184: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96321c160> <<< 46400 1727204510.89226: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c280> <<< 46400 1727204510.89255: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321cf70> <<< 46400 1727204510.89283: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 46400 1727204510.89286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 46400 1727204510.89332: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c4f0> <<< 46400 1727204510.89337: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321cd90> import 'atexit' # <<< 46400 1727204510.89369: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.89387: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96321cfd0> <<< 46400 1727204510.89390: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 46400 1727204510.89413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 46400 1727204510.89462: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c100> <<< 46400 1727204510.89488: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 46400 1727204510.89491: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 46400 1727204510.89512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 46400 1727204510.89524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 46400 1727204510.89558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 46400 1727204510.89640: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631f10d0> <<< 46400 1727204510.89684: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.89688: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962bc8340> <<< 46400 1727204510.89715: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.89720: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962bc8040> <<< 46400 1727204510.89741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 46400 1727204510.89792: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962bc8ca0> <<< 46400 1727204510.89796: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963204dc0> <<< 46400 1727204510.89965: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9632043a0> <<< 46400 1727204510.89989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 46400 1727204510.89992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 46400 1727204510.90011: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963204fd0> <<< 46400 1727204510.90028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 46400 1727204510.90045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 46400 1727204510.90081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 46400 1727204510.90084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 46400 1727204510.90099: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 46400 1727204510.90110: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 46400 1727204510.90146: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 46400 1727204510.90149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 46400 1727204510.90151: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963251d30> <<< 46400 1727204510.90222: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203d30> <<< 46400 1727204510.90225: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631cfb20> <<< 46400 1727204510.90267: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.90270: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963203520> <<< 46400 1727204510.90293: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203550> <<< 46400 1727204510.90334: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 46400 1727204510.90355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 46400 1727204510.90389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 46400 1727204510.90459: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963164fd0> <<< 46400 1727204510.90462: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963263250> <<< 46400 1727204510.90484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 46400 1727204510.90497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 46400 1727204510.90559: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963161850> <<< 46400 1727204510.90562: stdout chunk (state=3): >>>import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff9632633d0> <<< 46400 1727204510.90585: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 46400 1727204510.90624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.90641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 46400 1727204510.90654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 46400 1727204510.90719: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963263ca0> <<< 46400 1727204510.90850: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631617f0> <<< 46400 1727204510.90938: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9631fcc10> <<< 46400 1727204510.90978: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.90981: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963263fa0> <<< 46400 1727204510.91020: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.91025: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963263550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96325c910> <<< 46400 1727204510.91053: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 46400 1727204510.91074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 46400 1727204510.91091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 46400 1727204510.91137: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963155940> <<< 46400 1727204510.91316: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963173d90> <<< 46400 1727204510.91320: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963160580> <<< 46400 1727204510.91368: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.91374: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963155ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631609a0> <<< 46400 1727204510.91399: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 46400 1727204510.91402: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 46400 1727204510.91416: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.91491: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.91567: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 46400 1727204510.91581: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 46400 1727204510.91598: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.91614: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 46400 1727204510.91631: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.91727: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.91831: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.92273: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.93046: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96319c7f0> <<< 46400 1727204510.93349: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631a18b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627c4970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 46400 1727204510.93994: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631da730> # zipimport: zlib available <<< 46400 1727204510.94262: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204510.94268: stdout chunk (state=3): >>> <<< 46400 1727204510.94881: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.94959: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95062: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 46400 1727204510.95069: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95113: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95151: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 46400 1727204510.95166: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95246: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95381: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 46400 1727204510.95429: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95442: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 46400 1727204510.95643: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.95826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches 
/usr/lib64/python3.9/ast.py <<< 46400 1727204510.95858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 46400 1727204510.95861: stdout chunk (state=3): >>>import '_ast' # <<< 46400 1727204510.95931: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321e370> # zipimport: zlib available <<< 46400 1727204510.95999: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96069: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 46400 1727204510.96074: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 46400 1727204510.96093: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96127: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96165: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 46400 1727204510.96208: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96247: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96337: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96403: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 46400 1727204510.96422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204510.96497: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.96500: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96318f550> <<< 46400 1727204510.96591: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962640eb0> <<< 46400 1727204510.96611: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 46400 1727204510.96626: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 46400 1727204510.96678: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96729: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.96755: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97283: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631967f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963194790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96318fb50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 46400 1727204510.97370: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 46400 1727204510.97374: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97376: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 46400 1727204510.97388: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97459: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97534: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97557: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97577: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97626: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.97680: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204510.98000: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 46400 1727204510.98045: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py<<< 46400 1727204510.98048: stdout chunk (state=3): >>> <<< 46400 1727204510.98082: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204510.98085: stdout chunk (state=3): >>> <<< 46400 1727204510.98330: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204510.98333: stdout chunk (state=3): >>> <<< 46400 1727204510.98569: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204510.98572: stdout chunk (state=3): >>> <<< 46400 1727204510.98637: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204510.98640: stdout chunk (state=3): >>> <<< 46400 1727204510.98726: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py<<< 46400 1727204510.98729: stdout chunk (state=3): >>> <<< 46400 1727204510.98731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc'<<< 46400 1727204510.98742: stdout chunk (state=3): >>> <<< 46400 1727204510.98786: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py<<< 46400 1727204510.98800: stdout chunk (state=3): >>> <<< 46400 1727204510.98814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 46400 1727204510.98858: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py<<< 46400 1727204510.98861: stdout chunk (state=3): >>> <<< 46400 1727204510.98894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc'<<< 46400 1727204510.98897: stdout chunk (state=3): >>> <<< 46400 1727204510.98941: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96278a370><<< 46400 1727204510.98944: stdout chunk (state=3): >>> <<< 46400 1727204510.98988: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 46400 1727204510.99014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc'<<< 46400 1727204510.99017: stdout chunk (state=3): >>> <<< 46400 1727204510.99051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 46400 1727204510.99107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc'<<< 46400 1727204510.99120: stdout chunk (state=3): >>> <<< 46400 1727204510.99150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 46400 1727204510.99176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc'<<< 46400 1727204510.99188: stdout chunk (state=3): >>> <<< 46400 1727204510.99218: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627a7a90> <<< 46400 1727204510.99268: stdout chunk (state=3): >>># extension module 
'_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204510.99283: stdout chunk (state=3): >>> <<< 46400 1727204510.99297: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204510.99316: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9627a7b20><<< 46400 1727204510.99333: stdout chunk (state=3): >>> <<< 46400 1727204510.99423: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96277a280><<< 46400 1727204510.99440: stdout chunk (state=3): >>> <<< 46400 1727204510.99468: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96278a970> <<< 46400 1727204510.99506: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9625457f0><<< 46400 1727204510.99517: stdout chunk (state=3): >>> <<< 46400 1727204510.99548: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962545b20> <<< 46400 1727204510.99574: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py<<< 46400 1727204510.99588: stdout chunk (state=3): >>> <<< 46400 1727204510.99622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc'<<< 46400 1727204510.99632: stdout chunk (state=3): >>> <<< 46400 1727204510.99669: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 46400 1727204510.99690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc'<<< 46400 1727204510.99702: stdout chunk (state=3): >>> <<< 46400 1727204510.99739: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204510.99762: stdout chunk (state=3): >>> # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204510.99782: stdout chunk (state=3): >>> import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9627ed0a0><<< 46400 1727204510.99792: stdout chunk (state=3): >>> <<< 46400 1727204510.99815: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962787f70> <<< 46400 1727204510.99850: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py<<< 46400 1727204510.99871: stdout chunk (state=3): >>> <<< 46400 1727204510.99888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc'<<< 46400 1727204510.99898: stdout chunk (state=3): >>> <<< 46400 1727204510.99936: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627ed190><<< 46400 1727204510.99947: stdout chunk (state=3): >>> <<< 46400 1727204510.99984: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 
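The "# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/...zip'" line earlier in the trace, together with the repeated "loaded from Zip ..." imports, reflects the AnsiballZ mechanism: the setup module and the module_utils it depends on are shipped to the managed node as a single zip payload and imported directly from that archive, with "# zipimport: zlib available" indicating the interpreter can decompress its members. A minimal sketch of importing a package straight out of a zip via the same standard-library mechanism (the file and package names below are made up for illustration and do not appear in the log):

    import sys
    import zipfile

    # Build a tiny package inside a zip archive.
    with zipfile.ZipFile("payload_demo.zip", "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "VERSION = '1.0'\n")
        zf.writestr("demo_pkg/util.py", "def greet():\n    return 'hello'\n")

    # Putting the archive on sys.path lets the import system load from it via
    # zipimport, with no extraction to disk required.
    sys.path.insert(0, "payload_demo.zip")

    import demo_pkg.util

    print(demo_pkg.util.greet())    # -> hello
    print(demo_pkg.util.__file__)   # path points inside the zip archive
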
46400 1727204511.00018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc'<<< 46400 1727204511.00030: stdout chunk (state=3): >>> <<< 46400 1727204511.00070: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204511.00096: stdout chunk (state=3): >>> # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204511.00113: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9625adfd0> <<< 46400 1727204511.00165: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627d6820><<< 46400 1727204511.00177: stdout chunk (state=3): >>> <<< 46400 1727204511.00209: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962545d60><<< 46400 1727204511.00223: stdout chunk (state=3): >>> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py<<< 46400 1727204511.00236: stdout chunk (state=3): >>> <<< 46400 1727204511.00265: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 46400 1727204511.00290: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00315: stdout chunk (state=3): >>> <<< 46400 1727204511.00328: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.00348: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py<<< 46400 1727204511.00358: stdout chunk (state=3): >>> <<< 46400 1727204511.00384: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00395: stdout chunk (state=3): >>> <<< 46400 1727204511.00470: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00486: stdout chunk (state=3): >>> <<< 46400 1727204511.00552: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py<<< 46400 1727204511.00565: stdout chunk (state=3): >>> <<< 46400 1727204511.00597: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00608: stdout chunk (state=3): >>> <<< 46400 1727204511.00678: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00689: stdout chunk (state=3): >>> <<< 46400 1727204511.00758: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py<<< 46400 1727204511.00772: stdout chunk (state=3): >>> <<< 46400 1727204511.00787: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00798: stdout chunk (state=3): >>> <<< 46400 1727204511.00827: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 46400 1727204511.00846: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py<<< 46400 1727204511.00852: stdout chunk (state=3): >>> <<< 46400 1727204511.00877: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00888: stdout chunk (state=3): >>> <<< 46400 1727204511.00925: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.00937: stdout chunk (state=3): >>> <<< 46400 1727204511.00980: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py<<< 46400 1727204511.00991: stdout chunk (state=3): >>> <<< 46400 1727204511.01014: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.01095: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.01159: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py<<< 46400 1727204511.01183: stdout chunk (state=3): >>> <<< 46400 1727204511.01199: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.01261: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01277: stdout chunk (state=3): >>> <<< 46400 1727204511.01355: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py<<< 46400 1727204511.01370: stdout chunk (state=3): >>> <<< 46400 1727204511.01387: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01407: stdout chunk (state=3): >>> <<< 46400 1727204511.01482: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01492: stdout chunk (state=3): >>> <<< 46400 1727204511.01570: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01580: stdout chunk (state=3): >>> <<< 46400 1727204511.01670: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01673: stdout chunk (state=3): >>> <<< 46400 1727204511.01749: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py<<< 46400 1727204511.01753: stdout chunk (state=3): >>> <<< 46400 1727204511.01771: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 46400 1727204511.01802: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.01805: stdout chunk (state=3): >>> <<< 46400 1727204511.02466: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03119: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available <<< 46400 1727204511.03195: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03269: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03328: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03367: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 46400 1727204511.03370: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 46400 1727204511.03373: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03399: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03428: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 46400 1727204511.03441: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03488: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03541: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 46400 1727204511.03545: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03577: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03609: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 46400 1727204511.03612: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03637: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03669: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 46400 1727204511.03678: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03742: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.03823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 46400 1727204511.03826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 46400 1727204511.03840: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962495e80> <<< 46400 1727204511.03866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 46400 1727204511.03890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 46400 1727204511.04047: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624959d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 46400 1727204511.04068: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04116: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04188: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 46400 1727204511.04259: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04336: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 46400 1727204511.04353: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04404: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04477: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 46400 1727204511.04524: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.04560: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 46400 1727204511.04586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 46400 1727204511.04739: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204511.04742: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96250b490> <<< 46400 1727204511.04985: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a3850> <<< 46400 1727204511.04988: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 46400 1727204511.05036: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.05093: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 46400 1727204511.05096: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.05163: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.05236: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.05338: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.05483: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 46400 1727204511.05486: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 46400 1727204511.05853: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962508670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962508220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 46400 1727204511.06011: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 46400 1727204511.06436: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 46400 1727204511.06474: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.06597: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.06645: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.06711: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 46400 1727204511.06714: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 46400 1727204511.06814: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.06846: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204511.06913: stdout chunk (state=3): >>> <<< 46400 1727204511.07099: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.07269: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 46400 1727204511.07284: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 46400 1727204511.07287: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.07445: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.07596: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 46400 1727204511.07608: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.07645: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.07685: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.08348: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.08943: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 46400 1727204511.08951: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.09178: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 46400 1727204511.09261: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.09266: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 46400 1727204511.09367: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.09729: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 46400 1727204511.09778: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available <<< 46400 1727204511.09813: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.09988: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10172: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 46400 1727204511.10177: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 46400 1727204511.10181: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10202: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10233: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 46400 1727204511.10236: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10255: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10285: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 46400 1727204511.10288: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10346: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10402: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 46400 1727204511.10427: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10442: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10471: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 46400 1727204511.10474: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10523: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10577: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 46400 1727204511.10580: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10620: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10678: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 46400 1727204511.10682: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.10886: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11109: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 46400 1727204511.11112: stdout chunk (state=3): >>># zipimport: zlib available <<< 
46400 1727204511.11157: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11209: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 46400 1727204511.11255: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11273: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 46400 1727204511.11303: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11330: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11344: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 46400 1727204511.11378: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11419: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 46400 1727204511.11422: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11483: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11553: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 46400 1727204511.11579: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 46400 1727204511.11592: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11644: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11662: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 46400 1727204511.11698: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11702: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11714: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11757: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11796: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.11852: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.12266: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 46400 1727204511.12423: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.12626: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available <<< 46400 1727204511.12689: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 46400 1727204511.12692: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204511.13403: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available <<< 46400 1727204511.14383: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 46400 1727204511.14398: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 46400 1727204511.14417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 46400 1727204511.14461: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' 
executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96244c550> <<< 46400 1727204511.14466: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631a4c10> <<< 46400 1727204511.14561: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a2af0> <<< 46400 1727204511.14898: stdout chunk (state=3): >>>import 'gc' # <<< 46400 1727204511.16729: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 46400 1727204511.16758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 46400 1727204511.16761: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a2f40> <<< 46400 1727204511.16785: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 46400 1727204511.16809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 46400 1727204511.16848: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a3250> <<< 46400 1727204511.16907: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204511.16945: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 46400 1727204511.16958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9622a9040> <<< 46400 1727204511.16966: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9622dbeb0> <<< 46400 1727204511.17409: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 46400 1727204511.17419: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 46400 1727204511.17438: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 46400 1727204511.17459: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 46400 1727204511.41927: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": 
"/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2789, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 743, "free": 2789}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 874, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266829824, "block_size": 4096, "block_total": 65519355, "block_available": 64518269, "block_used": 1001086, "inode_total": 131071472, "inode_available": 130998223, "inode_used": 73249, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": 
"10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "of<<< 46400 1727204511.41958: stdout chunk (state=3): >>>f [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public<<< 46400 1727204511.41972: stdout chunk (state=3): >>>_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "51", "epoch": "1727204511", "epoch_int": "1727204511", "date": "2024-09-24", "time": "15:01:51", "iso8601_micro": "2024-09-24T19:01:51.415952Z", "iso8601": "2024-09-24T19:01:51Z", "iso8601_basic": "20240924T150151415952", "iso8601_basic_short": "20240924T150151", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.42, "5m": 0.55, "15m": 0.33}, "ansible_service_mgr": "systemd", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 46400 1727204511.42495: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin<<< 46400 1727204511.42513: stdout chunk (state=3): >>> # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct<<< 46400 1727204511.42810: stdout chunk (state=3): >>> # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect 
# cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select <<< 46400 1727204511.42900: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 46400 1727204511.43170: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 46400 1727204511.43197: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 46400 1727204511.43223: stdout chunk (state=3): >>># destroy zipimport <<< 46400 1727204511.43232: stdout chunk (state=3): >>># destroy _compression <<< 46400 1727204511.43251: stdout chunk (state=3): >>># destroy 
binascii # destroy importlib # destroy bz2 # destroy lzma <<< 46400 1727204511.43285: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 46400 1727204511.43311: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 46400 1727204511.43348: stdout chunk (state=3): >>># destroy selinux <<< 46400 1727204511.43354: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 46400 1727204511.43402: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 46400 1727204511.43417: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 46400 1727204511.43433: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 46400 1727204511.43462: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 46400 1727204511.43501: stdout chunk (state=3): >>># destroy base64 <<< 46400 1727204511.43528: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 46400 1727204511.43548: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 46400 1727204511.43583: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna <<< 46400 1727204511.43599: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 46400 1727204511.43616: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 46400 1727204511.43634: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess <<< 46400 1727204511.43651: stdout chunk (state=3): >>># cleanup[3] wiping selectors <<< 46400 1727204511.43669: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping 
pwd # cleanup[3] wiping _lzma <<< 46400 1727204511.43686: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 46400 1727204511.43701: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 46400 1727204511.43722: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 46400 1727204511.43741: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 46400 1727204511.43758: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 46400 1727204511.43774: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 46400 1727204511.43788: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 46400 1727204511.43813: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl <<< 46400 1727204511.43829: stdout chunk (state=3): >>># destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 46400 1727204511.43988: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 46400 1727204511.44002: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq <<< 46400 1727204511.44020: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 46400 1727204511.44035: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 46400 1727204511.44061: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator 
<<< 46400 1727204511.44077: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 46400 1727204511.44115: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 46400 1727204511.44499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204511.44508: stdout chunk (state=3): >>><<< 46400 1727204511.44524: stderr chunk (state=3): >>><<< 46400 1727204511.44699: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9638d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963857850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963834940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963895880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96382dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963857d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96387d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d4f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d90a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635cc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d36a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635d43d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from 
'/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9634bae20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634ba910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634baf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634bafd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cb0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635aed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635a7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635ba6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635dae80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9634cbcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635ae2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9635ba2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635e0a30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbeb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cbd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96349e3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96349e4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634d3f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cda90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cd490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633d2220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963489520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cdf10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9635e00a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff9633e4b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633e4e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96338e400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633e4f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96339f2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633f5610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96339f3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cba30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bb7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bb8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633bbd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9633c6250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bb940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633aea90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9634cb610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9633bbaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff9632de6d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96321c160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321cf70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321cd90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96321cfd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321c100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631f10d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962bc8340> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962bc8040> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962bc8ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963204dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9632043a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963204fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches 
/usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963251d30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631cfb20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963203520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963203550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963164fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963263250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963161850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9632633d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963263ca0> 
import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631617f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9631fcc10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963263fa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963263550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96325c910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963155940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963173d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff963160580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff963155ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631609a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96319c7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631a18b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627c4970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631da730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96321e370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96318f550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962640eb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631967f0> import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff963194790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96318fb50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96278a370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627a7a90> # extension module '_pickle' loaded 
from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9627a7b20> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96277a280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff96278a970> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9625457f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962545b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9627ed0a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962787f70> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627ed190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff9625adfd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9627d6820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962545d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962495e80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624959d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96250b490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a3850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches 
/usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff962508670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff962508220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_h1oe9ri0/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff96244c550> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9631a4c10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a2af0> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a2f40> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9624a3250> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff9622a9040> import 'multiprocessing.dummy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff9622dbeb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2789, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 743, "free": 2789}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 874, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266829824, "block_size": 4096, "block_total": 65519355, "block_available": 64518269, "block_used": 1001086, "inode_total": 131071472, "inode_available": 130998223, "inode_used": 73249, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "51", "epoch": "1727204511", "epoch_int": "1727204511", "date": "2024-09-24", "time": "15:01:51", "iso8601_micro": "2024-09-24T19:01:51.415952Z", "iso8601": "2024-09-24T19:01:51Z", "iso8601_basic": "20240924T150151415952", "iso8601_basic_short": "20240924T150151", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.42, "5m": 0.55, "15m": 0.33}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing 
weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing 
ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy 
ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing 
multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping 
itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
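Note on the warning that follows: the ansible.builtin.setup result above is wrapped in a large block of Python import and interpreter-teardown tracing because the remote environment has PYTHONVERBOSE=1 set (see ansible_env in the facts above), so the module's Python process emits that chatter around the JSON it returns; Ansible reports it as "junk after the JSON data" but still parses the facts. A minimal sketch of one way to suppress the tracing for a fact-gathering task is shown below; the task name and the use of the environment keyword to blank out PYTHONVERBOSE are illustrative assumptions, not taken from this run:

    - name: Gather facts without Python verbose tracing
      ansible.builtin.setup:
        gather_subset:
          - all
      environment:
        # Assumption: the tracing comes from PYTHONVERBOSE=1 in the remote
        # environment. CPython treats an empty value as unset, so -v style
        # import/cleanup output is not produced by the module process.
        PYTHONVERBOSE: ""
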
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 46400 1727204511.46053: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204511.46129: _low_level_execute_command(): starting 46400 1727204511.46132: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204509.787712-46487-192195880851373/ > /dev/null 2>&1 && sleep 0' 46400 1727204511.46786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204511.46805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.46827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.46845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.46890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.46903: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204511.46927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.46945: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 46400 1727204511.46957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204511.46969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204511.46980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.46992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.47009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.47023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.47039: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204511.47052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.47135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.47168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204511.47184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204511.47270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204511.49388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204511.49435: stderr chunk (state=3): >>><<< 46400 1727204511.49438: stdout chunk (state=3): >>><<< 46400 1727204511.49451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204511.49458: handler run complete 46400 1727204511.49540: variable 'ansible_facts' from source: unknown 46400 1727204511.49603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.49802: variable 'ansible_facts' from source: unknown 46400 1727204511.49854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.49932: attempt loop complete, returning result 46400 1727204511.49936: _execute() done 46400 1727204511.49938: dumping result to json 46400 1727204511.49959: done dumping result, returning 46400 1727204511.49970: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-1303-fda8-00000000001b] 46400 
1727204511.49972: sending task result for task 0affcd87-79f5-1303-fda8-00000000001b 46400 1727204511.50224: done sending task result for task 0affcd87-79f5-1303-fda8-00000000001b 46400 1727204511.50227: WORKER PROCESS EXITING ok: [managed-node2] 46400 1727204511.50665: no more pending results, returning what we have 46400 1727204511.50668: results queue empty 46400 1727204511.50669: checking for any_errors_fatal 46400 1727204511.50671: done checking for any_errors_fatal 46400 1727204511.50671: checking for max_fail_percentage 46400 1727204511.50673: done checking for max_fail_percentage 46400 1727204511.50674: checking to see if all hosts have failed and the running result is not ok 46400 1727204511.50674: done checking to see if all hosts have failed 46400 1727204511.50675: getting the remaining hosts for this loop 46400 1727204511.50677: done getting the remaining hosts for this loop 46400 1727204511.50681: getting the next task for host managed-node2 46400 1727204511.50687: done getting next task for host managed-node2 46400 1727204511.50689: ^ task is: TASK: meta (flush_handlers) 46400 1727204511.50691: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204511.50694: getting variables 46400 1727204511.50695: in VariableManager get_vars() 46400 1727204511.50716: Calling all_inventory to load vars for managed-node2 46400 1727204511.50719: Calling groups_inventory to load vars for managed-node2 46400 1727204511.50722: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.50735: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.50738: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.50743: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.50932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.51146: done with get_vars() 46400 1727204511.51156: done getting variables 46400 1727204511.51228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 46400 1727204511.51297: in VariableManager get_vars() 46400 1727204511.51310: Calling all_inventory to load vars for managed-node2 46400 1727204511.51313: Calling groups_inventory to load vars for managed-node2 46400 1727204511.51315: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.51319: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.51328: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.51331: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.51457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.51644: done with get_vars() 46400 1727204511.51659: done queuing things up, now waiting for results queue to drain 46400 1727204511.51661: results queue empty 46400 1727204511.51668: checking for any_errors_fatal 46400 1727204511.51670: done checking for any_errors_fatal 46400 1727204511.51671: checking for max_fail_percentage 46400 1727204511.51676: done checking for max_fail_percentage 46400 1727204511.51677: checking to see if all hosts have failed and the running result is not ok 46400 1727204511.51678: done 
checking to see if all hosts have failed 46400 1727204511.51678: getting the remaining hosts for this loop 46400 1727204511.51679: done getting the remaining hosts for this loop 46400 1727204511.51682: getting the next task for host managed-node2 46400 1727204511.51687: done getting next task for host managed-node2 46400 1727204511.51690: ^ task is: TASK: Include the task 'el_repo_setup.yml' 46400 1727204511.51691: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204511.51693: getting variables 46400 1727204511.51694: in VariableManager get_vars() 46400 1727204511.51702: Calling all_inventory to load vars for managed-node2 46400 1727204511.51704: Calling groups_inventory to load vars for managed-node2 46400 1727204511.51706: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.51710: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.51712: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.51715: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.51853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.52052: done with get_vars() 46400 1727204511.52061: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Tuesday 24 September 2024 15:01:51 -0400 (0:00:01.785) 0:00:01.806 ***** 46400 1727204511.52149: entering _queue_task() for managed-node2/include_tasks 46400 1727204511.52151: Creating lock for include_tasks 46400 1727204511.52517: worker is 1 (out of 1 available) 46400 1727204511.52532: exiting _queue_task() for managed-node2/include_tasks 46400 1727204511.52555: done queuing things up, now waiting for results queue to drain 46400 1727204511.52557: waiting for pending results... 
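The "^ state is: HOST STATE: block=..., task=..." lines above are the per-host iterator state that the strategy consults when "getting the next task" for managed-node2. A rough illustrative mirror of those fields as a Python dataclass, with field names copied from the repr printed in this log; this is a sketch for reading the log, not ansible-core's HostState class:

from dataclasses import dataclass
from typing import Optional

@dataclass
class HostStateSketch:
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    handlers: int = 0
    run_state: int = 1                 # 1 stands in for "iterating tasks" in this sketch
    fail_state: int = 0
    pre_flushing_run_state: Optional[int] = None
    update_handlers: bool = True
    pending_setup: bool = False
    tasks_child_state: Optional["HostStateSketch"] = None
    rescue_child_state: Optional["HostStateSketch"] = None
    always_child_state: Optional["HostStateSketch"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# Roughly the state printed for managed-node2 just before the include task above.
print(HostStateSketch(block=2, task=1, run_state=1, pre_flushing_run_state=1))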
46400 1727204511.52720: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 46400 1727204511.52778: in run() - task 0affcd87-79f5-1303-fda8-000000000006 46400 1727204511.52788: variable 'ansible_search_path' from source: unknown 46400 1727204511.52815: calling self._execute() 46400 1727204511.52875: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204511.52879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204511.52886: variable 'omit' from source: magic vars 46400 1727204511.52958: _execute() done 46400 1727204511.52966: dumping result to json 46400 1727204511.52970: done dumping result, returning 46400 1727204511.52972: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-1303-fda8-000000000006] 46400 1727204511.52978: sending task result for task 0affcd87-79f5-1303-fda8-000000000006 46400 1727204511.53067: done sending task result for task 0affcd87-79f5-1303-fda8-000000000006 46400 1727204511.53070: WORKER PROCESS EXITING 46400 1727204511.53142: no more pending results, returning what we have 46400 1727204511.53146: in VariableManager get_vars() 46400 1727204511.53175: Calling all_inventory to load vars for managed-node2 46400 1727204511.53178: Calling groups_inventory to load vars for managed-node2 46400 1727204511.53180: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.53190: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.53196: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.53199: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.53431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.53535: done with get_vars() 46400 1727204511.53540: variable 'ansible_search_path' from source: unknown 46400 1727204511.53550: we have included files to process 46400 1727204511.53551: generating all_blocks data 46400 1727204511.53552: done generating all_blocks data 46400 1727204511.53552: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 46400 1727204511.53553: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 46400 1727204511.53555: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 46400 1727204511.53998: in VariableManager get_vars() 46400 1727204511.54009: done with get_vars() 46400 1727204511.54016: done processing included file 46400 1727204511.54017: iterating over new_blocks loaded from include file 46400 1727204511.54018: in VariableManager get_vars() 46400 1727204511.54024: done with get_vars() 46400 1727204511.54025: filtering new block on tags 46400 1727204511.54034: done filtering new block on tags 46400 1727204511.54036: in VariableManager get_vars() 46400 1727204511.54043: done with get_vars() 46400 1727204511.54044: filtering new block on tags 46400 1727204511.54053: done filtering new block on tags 46400 1727204511.54054: in VariableManager get_vars() 46400 1727204511.54063: done with get_vars() 46400 1727204511.54069: filtering new block on tags 46400 1727204511.54080: done filtering new block on tags 46400 1727204511.54081: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 46400 1727204511.54085: extending task lists for all hosts with included blocks 46400 1727204511.54115: done extending task lists 46400 1727204511.54115: done processing included files 46400 1727204511.54116: results queue empty 46400 1727204511.54116: checking for any_errors_fatal 46400 1727204511.54117: done checking for any_errors_fatal 46400 1727204511.54118: checking for max_fail_percentage 46400 1727204511.54118: done checking for max_fail_percentage 46400 1727204511.54119: checking to see if all hosts have failed and the running result is not ok 46400 1727204511.54119: done checking to see if all hosts have failed 46400 1727204511.54120: getting the remaining hosts for this loop 46400 1727204511.54120: done getting the remaining hosts for this loop 46400 1727204511.54122: getting the next task for host managed-node2 46400 1727204511.54125: done getting next task for host managed-node2 46400 1727204511.54126: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 46400 1727204511.54127: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204511.54129: getting variables 46400 1727204511.54129: in VariableManager get_vars() 46400 1727204511.54135: Calling all_inventory to load vars for managed-node2 46400 1727204511.54136: Calling groups_inventory to load vars for managed-node2 46400 1727204511.54138: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.54141: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.54142: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.54144: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.56225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.56421: done with get_vars() 46400 1727204511.56430: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.043) 0:00:01.849 ***** 46400 1727204511.56509: entering _queue_task() for managed-node2/setup 46400 1727204511.56821: worker is 1 (out of 1 available) 46400 1727204511.56834: exiting _queue_task() for managed-node2/setup 46400 1727204511.56855: done queuing things up, now waiting for results queue to drain 46400 1727204511.56857: waiting for pending results... 
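Each task header above ends with a timestamp and two durations, e.g. "(0:00:00.043) 0:00:01.849". From the values in this log, the parenthesized figure is the time elapsed since the previous task header and the trailing figure is the cumulative time for the run; a quick check with the numbers shown above:

# Values copied from the two task headers above.
total_after_gathering_facts = 1.806   # "0:00:01.806" on the include-task header
time_for_include_task = 0.043         # "(0:00:00.043)" on the fact-subset header

print(round(total_after_gathering_facts + time_for_include_task, 3))  # 1.849 -> "0:00:01.849"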
46400 1727204511.57200: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 46400 1727204511.57267: in run() - task 0affcd87-79f5-1303-fda8-00000000002c 46400 1727204511.57283: variable 'ansible_search_path' from source: unknown 46400 1727204511.57286: variable 'ansible_search_path' from source: unknown 46400 1727204511.57316: calling self._execute() 46400 1727204511.57371: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204511.57375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204511.57384: variable 'omit' from source: magic vars 46400 1727204511.57741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204511.59441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204511.59533: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204511.59587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204511.59626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204511.59659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204511.59749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204511.59786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204511.59816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204511.59859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204511.59879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204511.60050: variable 'ansible_facts' from source: unknown 46400 1727204511.60127: variable 'network_test_required_facts' from source: task vars 46400 1727204511.60172: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 46400 1727204511.60181: when evaluation is False, skipping this task 46400 1727204511.60192: _execute() done 46400 1727204511.60199: dumping result to json 46400 1727204511.60205: done dumping result, returning 46400 1727204511.60218: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-1303-fda8-00000000002c] 46400 1727204511.60229: sending task result for task 0affcd87-79f5-1303-fda8-00000000002c skipping: [managed-node2] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) 
== network_test_required_facts", "skip_reason": "Conditional result was False" } 46400 1727204511.60409: no more pending results, returning what we have 46400 1727204511.60413: results queue empty 46400 1727204511.60414: checking for any_errors_fatal 46400 1727204511.60415: done checking for any_errors_fatal 46400 1727204511.60416: checking for max_fail_percentage 46400 1727204511.60418: done checking for max_fail_percentage 46400 1727204511.60418: checking to see if all hosts have failed and the running result is not ok 46400 1727204511.60419: done checking to see if all hosts have failed 46400 1727204511.60420: getting the remaining hosts for this loop 46400 1727204511.60421: done getting the remaining hosts for this loop 46400 1727204511.60425: getting the next task for host managed-node2 46400 1727204511.60434: done getting next task for host managed-node2 46400 1727204511.60437: ^ task is: TASK: Check if system is ostree 46400 1727204511.60440: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204511.60442: getting variables 46400 1727204511.60444: in VariableManager get_vars() 46400 1727204511.60475: Calling all_inventory to load vars for managed-node2 46400 1727204511.60478: Calling groups_inventory to load vars for managed-node2 46400 1727204511.60481: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204511.60492: Calling all_plugins_play to load vars for managed-node2 46400 1727204511.60494: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204511.60497: Calling groups_plugins_play to load vars for managed-node2 46400 1727204511.60673: done sending task result for task 0affcd87-79f5-1303-fda8-00000000002c 46400 1727204511.60677: WORKER PROCESS EXITING 46400 1727204511.60700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204511.60905: done with get_vars() 46400 1727204511.60915: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.044) 0:00:01.894 ***** 46400 1727204511.61008: entering _queue_task() for managed-node2/stat 46400 1727204511.61259: worker is 1 (out of 1 available) 46400 1727204511.61274: exiting _queue_task() for managed-node2/stat 46400 1727204511.61285: done queuing things up, now waiting for results queue to drain 46400 1727204511.61286: waiting for pending results... 
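The skip above comes from the task's when: expression, quoted in "Evaluated conditional (...)": the minimal fact-gathering task only runs if the facts collected so far do not already cover everything in network_test_required_facts. A standalone sketch of that evaluation using plain Jinja2; the intersect filter stand-in and the sample values below are assumptions for illustration, not taken from this run:

from jinja2 import Environment

env = Environment()
# Stand-in for Ansible's intersect filter: elements of a that also appear in b.
env.filters["intersect"] = lambda a, b: [x for x in a if x in b]

ansible_facts = {"distribution": "CentOS", "os_family": "RedHat"}   # hypothetical gathered facts
network_test_required_facts = ["distribution", "os_family"]         # hypothetical required subset

expr = ("{{ not ansible_facts.keys() | list | intersect(network_test_required_facts)"
        " == network_test_required_facts }}")
print(env.from_string(expr).render(
    ansible_facts=ansible_facts,
    network_test_required_facts=network_test_required_facts,
))  # "False" -> the condition fails, so the task is skipped, matching the result above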
46400 1727204511.61514: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 46400 1727204511.61605: in run() - task 0affcd87-79f5-1303-fda8-00000000002e 46400 1727204511.61622: variable 'ansible_search_path' from source: unknown 46400 1727204511.61633: variable 'ansible_search_path' from source: unknown 46400 1727204511.61676: calling self._execute() 46400 1727204511.61751: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204511.61762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204511.61777: variable 'omit' from source: magic vars 46400 1727204511.62229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204511.62478: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204511.62549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204511.62592: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204511.62631: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204511.62722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204511.62751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204511.62784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204511.62813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204511.62943: Evaluated conditional (not __network_is_ostree is defined): True 46400 1727204511.62960: variable 'omit' from source: magic vars 46400 1727204511.63002: variable 'omit' from source: magic vars 46400 1727204511.63045: variable 'omit' from source: magic vars 46400 1727204511.63075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204511.63105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204511.63127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204511.63152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204511.63168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204511.63202: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204511.63210: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204511.63217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204511.63316: Set connection var ansible_shell_type to sh 46400 1727204511.63330: Set connection var ansible_shell_executable to /bin/sh 46400 1727204511.63339: Set 
connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204511.63347: Set connection var ansible_connection to ssh 46400 1727204511.63357: Set connection var ansible_pipelining to False 46400 1727204511.63372: Set connection var ansible_timeout to 10 46400 1727204511.63401: variable 'ansible_shell_executable' from source: unknown 46400 1727204511.63409: variable 'ansible_connection' from source: unknown 46400 1727204511.63415: variable 'ansible_module_compression' from source: unknown 46400 1727204511.63421: variable 'ansible_shell_type' from source: unknown 46400 1727204511.63427: variable 'ansible_shell_executable' from source: unknown 46400 1727204511.63433: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204511.63439: variable 'ansible_pipelining' from source: unknown 46400 1727204511.63445: variable 'ansible_timeout' from source: unknown 46400 1727204511.63451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204511.63600: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204511.63616: variable 'omit' from source: magic vars 46400 1727204511.63626: starting attempt loop 46400 1727204511.63633: running the handler 46400 1727204511.63650: _low_level_execute_command(): starting 46400 1727204511.63665: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204511.64418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204511.64434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.64452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.64474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.64518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.64532: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204511.64546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.64569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204511.64582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204511.64593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204511.64604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.64618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.64632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.64642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.64653: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204511.64667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.64747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.64773: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 46400 1727204511.64794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204511.64882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 46400 1727204511.67096: stdout chunk (state=3): >>>/root <<< 46400 1727204511.67340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204511.67343: stdout chunk (state=3): >>><<< 46400 1727204511.67347: stderr chunk (state=3): >>><<< 46400 1727204511.67476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 46400 1727204511.67487: _low_level_execute_command(): starting 46400 1727204511.67490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115 `" && echo ansible-tmp-1727204511.6737523-46556-250492809790115="` echo /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115 `" ) && sleep 0' 46400 1727204511.68110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204511.68131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.68148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.68172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.68215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.68228: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204511.68246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.68271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204511.68277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204511.68285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204511.68293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.68302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.68313: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.68320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.68327: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204511.68335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.68414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.68431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204511.68443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204511.68516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 46400 1727204511.71088: stdout chunk (state=3): >>>ansible-tmp-1727204511.6737523-46556-250492809790115=/root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115 <<< 46400 1727204511.71391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204511.71501: stderr chunk (state=3): >>><<< 46400 1727204511.71505: stdout chunk (state=3): >>><<< 46400 1727204511.71570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204511.6737523-46556-250492809790115=/root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 46400 1727204511.71770: variable 'ansible_module_compression' from source: unknown 46400 1727204511.71773: ANSIBALLZ: Using lock for stat 46400 1727204511.71776: ANSIBALLZ: Acquiring lock 46400 1727204511.71778: ANSIBALLZ: Lock acquired: 140519374126064 46400 1727204511.71780: ANSIBALLZ: Creating module 46400 1727204511.88498: ANSIBALLZ: Writing module into payload 46400 1727204511.88777: ANSIBALLZ: Writing module 46400 1727204511.88814: ANSIBALLZ: Renaming module 46400 1727204511.88909: ANSIBALLZ: Done creating module 46400 1727204511.88930: variable 'ansible_facts' from source: unknown 46400 1727204511.89081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/AnsiballZ_stat.py 46400 1727204511.89906: Sending initial data 46400 1727204511.89916: Sent initial data (153 bytes) 46400 1727204511.92264: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.92416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.92421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.92481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.92578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204511.92601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204511.92814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204511.94471: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204511.94503: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204511.94543: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmphaxoka5f /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/AnsiballZ_stat.py <<< 46400 1727204511.94584: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204511.95872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204511.96018: stderr chunk (state=3): >>><<< 46400 1727204511.96021: stdout chunk (state=3): >>><<< 46400 1727204511.96024: done transferring module to remote 46400 1727204511.96030: _low_level_execute_command(): starting 46400 1727204511.96032: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/ /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/AnsiballZ_stat.py && sleep 0' 46400 1727204511.96635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204511.96649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.96669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.96688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.96732: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.96743: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204511.96757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.96780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204511.96791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204511.96801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204511.96813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.96826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.96841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.96852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.96867: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204511.96881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.96948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.96970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204511.96985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204511.97068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204511.98786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204511.98890: stderr chunk (state=3): >>><<< 46400 1727204511.98894: stdout chunk (state=3): >>><<< 46400 1727204511.99002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204511.99006: _low_level_execute_command(): starting 46400 1727204511.99009: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/AnsiballZ_stat.py && sleep 0' 46400 1727204511.99602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204511.99618: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 46400 1727204511.99634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.99654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.99701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.99713: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204511.99727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.99745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204511.99757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204511.99773: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204511.99787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204511.99800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204511.99816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204511.99828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204511.99838: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204511.99853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204511.99926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204511.99944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204511.99958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.00044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.01984: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 46400 1727204512.02028: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 46400 1727204512.02069: stdout chunk (state=3): >>>import 'posix' # <<< 46400 1727204512.02101: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 46400 1727204512.02137: stdout chunk (state=3): >>>import 'time' # <<< 46400 1727204512.02151: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 46400 1727204512.02218: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 46400 1727204512.02224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204512.02244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 46400 1727204512.02248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 46400 1727204512.02259: stdout chunk (state=3): >>>import '_codecs' # <<< 46400 1727204512.02279: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698dc0> <<< 46400 1727204512.02322: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 46400 1727204512.02328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 46400 1727204512.02341: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698b20> <<< 46400 1727204512.02355: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 46400 1727204512.02382: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698ac0> <<< 46400 1727204512.02394: stdout chunk (state=3): >>>import '_signal' # <<< 46400 1727204512.02420: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 46400 1727204512.02425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d490> <<< 46400 1727204512.02453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 46400 1727204512.02479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 46400 1727204512.02492: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d940> <<< 46400 1727204512.02516: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d670> <<< 46400 1727204512.02558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 46400 1727204512.02561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 46400 1727204512.02578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 46400 1727204512.02594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 46400 1727204512.02607: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 46400 1727204512.02630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 46400 1727204512.02660: stdout chunk (state=3): >>>import '_stat' # <<< 46400 1727204512.02664: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf190> <<< 46400 1727204512.02693: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 46400 1727204512.02696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 46400 1727204512.02761: stdout chunk (state=3): >>>import 
'_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf220> <<< 46400 1727204512.02793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 46400 1727204512.02828: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 46400 1727204512.02831: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3f2850> <<< 46400 1727204512.02833: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf940> <<< 46400 1727204512.02860: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f655880> <<< 46400 1727204512.02878: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 46400 1727204512.02898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3c8d90> <<< 46400 1727204512.02947: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 46400 1727204512.02950: stdout chunk (state=3): >>>import '_locale' # <<< 46400 1727204512.02953: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3f2d90> <<< 46400 1727204512.03014: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d970> <<< 46400 1727204512.03033: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 46400 1727204512.03244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 46400 1727204512.03250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 46400 1727204512.03272: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 46400 1727204512.03277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 46400 1727204512.03289: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 46400 1727204512.03310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 46400 1727204512.03338: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 46400 1727204512.03352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f392f10> <<< 46400 1727204512.03395: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3990a0> <<< 46400 1727204512.03417: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 46400 1727204512.03455: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 46400 1727204512.03476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 46400 1727204512.03524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 46400 1727204512.03527: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f38c5b0> <<< 46400 1727204512.03550: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3936a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3923d0> <<< 46400 1727204512.03576: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 46400 1727204512.03636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 46400 1727204512.03672: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 46400 1727204512.03685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204512.03707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 46400 1727204512.03752: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 46400 
1727204512.03766: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f316e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316970> <<< 46400 1727204512.03778: stdout chunk (state=3): >>>import 'itertools' # <<< 46400 1727204512.03792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316f70> <<< 46400 1727204512.03814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 46400 1727204512.03835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 46400 1727204512.03844: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316dc0> <<< 46400 1727204512.03884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326130> <<< 46400 1727204512.03888: stdout chunk (state=3): >>>import '_collections' # <<< 46400 1727204512.03929: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f36edf0> <<< 46400 1727204512.03941: stdout chunk (state=3): >>>import '_functools' # <<< 46400 1727204512.03961: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3676d0> <<< 46400 1727204512.04006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 46400 1727204512.04028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f37a730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f39ae80> <<< 46400 1727204512.04049: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py <<< 46400 1727204512.04063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 46400 1727204512.04081: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f326d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f36e310> <<< 46400 1727204512.04115: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.04130: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f37a340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3a0a30> <<< 46400 1727204512.04150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 46400 1727204512.04161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 46400 1727204512.04183: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204512.04222: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 46400 1727204512.04229: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326e50> <<< 46400 1727204512.04254: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326dc0> <<< 46400 1727204512.04279: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 46400 1727204512.04290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 46400 1727204512.04311: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 46400 1727204512.04329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 46400 1727204512.04380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 46400 1727204512.04403: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 46400 1727204512.04431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2fa430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 46400 1727204512.04448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 46400 1727204512.04468: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2fa520> <<< 46400 1727204512.04586: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f32ffa0> <<< 46400 1727204512.04617: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f329af0> <<< 46400 1727204512.04642: stdout chunk (state=3): >>>import 'importlib.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f077f3294c0> <<< 46400 1727204512.04662: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 46400 1727204512.04699: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 46400 1727204512.04712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 46400 1727204512.04743: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f22e280> <<< 46400 1727204512.04780: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2e5dc0> <<< 46400 1727204512.04821: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f329f70> <<< 46400 1727204512.04833: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3a00a0> <<< 46400 1727204512.04862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 46400 1727204512.04877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 46400 1727204512.04890: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 46400 1727204512.04904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f23fbb0> <<< 46400 1727204512.04916: stdout chunk (state=3): >>>import 'errno' # <<< 46400 1727204512.04951: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f23fee0> <<< 46400 1727204512.04976: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 46400 1727204512.04995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 46400 1727204512.05008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2517f0> <<< 46400 1727204512.05028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 46400 1727204512.05063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 46400 1727204512.05091: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f251d30> <<< 46400 1727204512.05125: stdout chunk (state=3): 
>>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.05138: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1df460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f23ffd0> <<< 46400 1727204512.05152: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 46400 1727204512.05161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 46400 1727204512.05210: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1ef340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f251670> <<< 46400 1727204512.05221: stdout chunk (state=3): >>>import 'pwd' # <<< 46400 1727204512.05250: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1ef400> <<< 46400 1727204512.05287: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326a90> <<< 46400 1727204512.05309: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 46400 1727204512.05320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 46400 1727204512.05344: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 46400 1727204512.05354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 46400 1727204512.05388: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.05408: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20b760> <<< 46400 1727204512.05424: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 46400 1727204512.05438: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20ba30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20b820> <<< 46400 1727204512.05461: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20b910> <<< 46400 1727204512.05495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 46400 1727204512.05511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 46400 1727204512.05679: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.05691: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20bd60> <<< 46400 1727204512.05717: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f2152b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20b9a0> <<< 46400 1727204512.05747: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f1ffaf0> <<< 46400 1727204512.05764: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326670> <<< 46400 1727204512.05786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 46400 1727204512.05846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 46400 1727204512.05875: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20bb50> <<< 46400 1727204512.05962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 46400 1727204512.05984: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f077ebe6730> <<< 46400 1727204512.06130: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip' <<< 46400 1727204512.06141: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.06225: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.06257: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/__init__.py <<< 46400 1727204512.06273: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.06292: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 46400 1727204512.06314: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.07558: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.08683: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches 
/usr/lib64/python3.9/__future__.py <<< 46400 1727204512.08688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 46400 1727204512.08690: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d880> <<< 46400 1727204512.08727: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 46400 1727204512.08739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 46400 1727204512.08780: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 46400 1727204512.08794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 46400 1727204512.08826: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 46400 1727204512.08836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 46400 1727204512.08876: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.08890: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.08905: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb0d160> <<< 46400 1727204512.08968: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d280> <<< 46400 1727204512.09018: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0dfd0> <<< 46400 1727204512.09153: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0ddf0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.09179: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.09181: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb0d580> <<< 46400 1727204512.09217: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 46400 1727204512.09261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 46400 1727204512.09325: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d100> <<< 46400 1727204512.09358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 46400 1727204512.09392: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 46400 1727204512.09427: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 46400 1727204512.09463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 46400 1727204512.09495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 46400 1727204512.09513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 46400 1727204512.09659: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea64fa0> <<< 46400 1727204512.09719: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.09724: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.09738: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea82c70> <<< 46400 1727204512.09777: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.09796: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea82f70> <<< 46400 1727204512.09833: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 46400 1727204512.09881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 46400 1727204512.09931: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea82310> <<< 46400 1727204512.09962: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb75dc0> <<< 46400 1727204512.10253: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb753a0> <<< 46400 1727204512.10294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 46400 1727204512.10309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 46400 1727204512.10340: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb75f40> <<< 46400 1727204512.10374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 46400 1727204512.10399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 46400 1727204512.10429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 46400 1727204512.10444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 46400 1727204512.10477: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 46400 1727204512.10503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 46400 1727204512.10536: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 46400 1727204512.10549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 46400 1727204512.10572: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb44e80> <<< 46400 1727204512.10687: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae0d90> <<< 46400 1727204512.10699: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae0460> <<< 46400 1727204512.10719: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb17550> <<< 46400 1727204512.10758: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.10766: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.10783: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eae0580> <<< 46400 1727204512.10824: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc'<<< 46400 1727204512.10837: stdout chunk (state=3): >>> <<< 46400 1727204512.10850: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae05b0> <<< 46400 1727204512.10888: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 46400 1727204512.10897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 46400 1727204512.10927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 46400 1727204512.10945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 46400 1727204512.11025: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.11032: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea55f70> <<< 46400 1727204512.11044: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb552b0> <<< 46400 1727204512.11051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 46400 1727204512.11074: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 46400 1727204512.11298: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb55430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 46400 1727204512.11377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb6de80> <<< 46400 1727204512.11544: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea52790> <<< 46400 1727204512.11643: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea525e0> <<< 46400 1727204512.11737: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea51550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea51490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb4e970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 46400 1727204512.11759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead66a0> <<< 46400 1727204512.11986: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 46400 1727204512.11994: stdout chunk (state=3): >>># extension module 'array' executed 
from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead5b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae60a0> <<< 46400 1727204512.12047: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb19be0> <<< 46400 1727204512.12078: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 46400 1727204512.12146: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.12231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 46400 1727204512.12249: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 46400 1727204512.12266: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 46400 1727204512.12289: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.12385: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.12473: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.13232: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.13716: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py<<< 46400 1727204512.13720: stdout chunk (state=3): >>> <<< 46400 1727204512.13747: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 46400 1727204512.13753: stdout chunk (state=3): >>> <<< 46400 1727204512.13788: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 46400 1727204512.13799: stdout chunk (state=3): >>> <<< 46400 1727204512.13802: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py<<< 46400 1727204512.13804: stdout chunk (state=3): >>> <<< 46400 1727204512.13841: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 46400 1727204512.13847: stdout chunk (state=3): >>> <<< 46400 1727204512.13883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 46400 1727204512.13886: stdout chunk (state=3): >>> <<< 46400 1727204512.13971: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 46400 
1727204512.13975: stdout chunk (state=3): >>> <<< 46400 1727204512.13992: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 46400 1727204512.14008: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea22ac0> <<< 46400 1727204512.14115: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py<<< 46400 1727204512.14120: stdout chunk (state=3): >>> <<< 46400 1727204512.14135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 46400 1727204512.14162: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead3d00><<< 46400 1727204512.14168: stdout chunk (state=3): >>> <<< 46400 1727204512.14196: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eaca850><<< 46400 1727204512.14201: stdout chunk (state=3): >>> <<< 46400 1727204512.14267: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py<<< 46400 1727204512.14272: stdout chunk (state=3): >>> <<< 46400 1727204512.14301: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.14306: stdout chunk (state=3): >>> <<< 46400 1727204512.14342: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.14374: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/_text.py<<< 46400 1727204512.14379: stdout chunk (state=3): >>> <<< 46400 1727204512.14407: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.14412: stdout chunk (state=3): >>> <<< 46400 1727204512.14610: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.14615: stdout chunk (state=3): >>> <<< 46400 1727204512.14825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 46400 1727204512.14838: stdout chunk (state=3): >>> <<< 46400 1727204512.14855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'<<< 46400 1727204512.14862: stdout chunk (state=3): >>> <<< 46400 1727204512.14899: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead59d0><<< 46400 1727204512.14906: stdout chunk (state=3): >>> <<< 46400 1727204512.14932: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.14937: stdout chunk (state=3): >>> <<< 46400 1727204512.15553: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.15929: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.15988: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.16050: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 46400 1727204512.16108: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.16127: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings 
# loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 46400 1727204512.16567: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 46400 1727204512.16760: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.17052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 46400 1727204512.17093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 46400 1727204512.17106: stdout chunk (state=3): >>>import '_ast' # <<< 46400 1727204512.17209: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077e642310> <<< 46400 1727204512.17695: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 46400 1727204512.17751: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.17757: stdout chunk (state=3): >>> <<< 46400 1727204512.17852: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py<<< 46400 1727204512.17858: stdout chunk (state=3): >>> <<< 46400 1727204512.17903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 46400 1727204512.17914: stdout chunk (state=3): >>> <<< 46400 1727204512.18048: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb5e2b0> <<< 46400 1727204512.18102: stdout 
chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead37c0> <<< 46400 1727204512.18213: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 46400 1727204512.18392: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.18478: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.18528: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.18586: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 46400 1727204512.18656: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 46400 1727204512.18710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 46400 1727204512.18749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 46400 1727204512.18783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 46400 1727204512.18929: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077e629760> <<< 46400 1727204512.19002: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea13610> <<< 46400 1727204512.19130: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea12b80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 46400 1727204512.19144: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.19186: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.19256: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 46400 1727204512.19415: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 46400 1727204512.19455: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 46400 1727204512.19472: stdout chunk (state=3): >>># zipimport: zlib available <<< 46400 1727204512.19626: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.19632: stdout chunk (state=3): >>> <<< 46400 1727204512.19902: stdout chunk (state=3): >>># zipimport: zlib available<<< 46400 1727204512.19905: stdout chunk (state=3): >>> 
<<< 46400 1727204512.20107: stdout chunk (state=3): >>> <<< 46400 1727204512.20116: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 46400 1727204512.20149: stdout chunk (state=3): >>># destroy __main__ <<< 46400 1727204512.20539: stdout chunk (state=3): >>># clear builtins._ <<< 46400 1727204512.20544: stdout chunk (state=3): >>># clear sys.path # clear sys.argv <<< 46400 1727204512.20547: stdout chunk (state=3): >>># clear sys.ps1<<< 46400 1727204512.20556: stdout chunk (state=3): >>> <<< 46400 1727204512.20560: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_type<<< 46400 1727204512.20577: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 46400 1727204512.20624: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path<<< 46400 1727204512.20630: stdout chunk (state=3): >>> <<< 46400 1727204512.20645: stdout chunk (state=3): >>># clear sys.__interactivehook__ # restore sys.stdin<<< 46400 1727204512.20650: stdout chunk (state=3): >>> # restore sys.stdout<<< 46400 1727204512.20667: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins<<< 46400 1727204512.20691: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 46400 1727204512.20707: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 46400 1727204512.20762: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io<<< 46400 1727204512.20780: stdout chunk (state=3): >>> # cleanup[2] removing marshal <<< 46400 1727204512.20786: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 46400 1727204512.20788: stdout chunk (state=3): >>># cleanup[2] removing time <<< 46400 1727204512.20792: stdout chunk (state=3): >>># cleanup[2] removing zipimport<<< 46400 1727204512.20826: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 46400 1727204512.20836: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 46400 1727204512.20872: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc<<< 46400 1727204512.20878: stdout chunk (state=3): >>> # cleanup[2] removing abc <<< 46400 1727204512.20898: stdout chunk (state=3): >>># cleanup[2] removing io # cleanup[2] removing __main__ <<< 46400 1727204512.20917: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 46400 1727204512.20987: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath<<< 46400 1727204512.21027: stdout chunk (state=3): >>> # cleanup[2] removing os.path # cleanup[2] removing os<<< 46400 1727204512.21033: stdout chunk (state=3): >>> # cleanup[2] removing _sitebuiltins<<< 46400 1727204512.21037: stdout chunk (state=3): >>> <<< 46400 1727204512.21041: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing _bootlocale <<< 46400 1727204512.21044: stdout chunk (state=3): >>># destroy _bootlocale<<< 46400 1727204512.21101: stdout chunk (state=3): >>> <<< 46400 
1727204512.21113: stdout chunk (state=3): >>># cleanup[2] removing site # destroy site<<< 46400 1727204512.21116: stdout chunk (state=3): >>> # cleanup[2] removing types<<< 46400 1727204512.21119: stdout chunk (state=3): >>> # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants<<< 46400 1727204512.21123: stdout chunk (state=3): >>> <<< 46400 1727204512.21125: stdout chunk (state=3): >>># destroy sre_constants # cleanup[2] removing sre_parse<<< 46400 1727204512.21127: stdout chunk (state=3): >>> # cleanup[2] removing sre_compile <<< 46400 1727204512.21129: stdout chunk (state=3): >>># cleanup[2] removing _heapq <<< 46400 1727204512.21151: stdout chunk (state=3): >>># cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword <<< 46400 1727204512.21222: stdout chunk (state=3): >>># destroy keyword # cleanup[2] removing _operator<<< 46400 1727204512.21425: stdout chunk (state=3): >>> # cleanup[2] removing operator<<< 46400 1727204512.21429: stdout chunk (state=3): >>> # cleanup[2] removing reprlib <<< 46400 1727204512.21433: stdout chunk (state=3): >>># destroy reprlib <<< 46400 1727204512.21435: stdout chunk (state=3): >>># cleanup[2] removing _collections <<< 46400 1727204512.21440: stdout chunk (state=3): >>># cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib<<< 46400 1727204512.21442: stdout chunk (state=3): >>> <<< 46400 1727204512.21449: stdout chunk (state=3): >>># cleanup[2] removing typing <<< 46400 1727204512.21452: stdout chunk (state=3): >>># destroy typing <<< 46400 1727204512.21456: stdout chunk (state=3): >>># cleanup[2] removing importlib.abc <<< 46400 1727204512.21458: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset<<< 46400 1727204512.21462: stdout chunk (state=3): >>> # destroy _weakrefset<<< 46400 1727204512.21644: stdout chunk (state=3): >>> # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy<<< 46400 1727204512.21647: stdout chunk (state=3): >>> <<< 46400 1727204512.21649: stdout chunk (state=3): >>># cleanup[2] removing fnmatch <<< 46400 1727204512.21651: stdout chunk (state=3): >>># cleanup[2] removing errno <<< 46400 1727204512.21665: stdout chunk (state=3): >>># cleanup[2] removing zlib # cleanup[2] removing _compression<<< 46400 1727204512.21668: stdout chunk (state=3): >>> # cleanup[2] removing threading<<< 46400 1727204512.21670: stdout chunk (state=3): >>> # cleanup[2] removing _bz2<<< 46400 1727204512.21691: stdout chunk (state=3): >>> # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 46400 1727204512.21715: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket<<< 46400 1727204512.21736: stdout chunk (state=3): >>> # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six <<< 46400 1727204512.21775: stdout chunk (state=3): >>># destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings<<< 46400 1727204512.21808: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec<<< 46400 1727204512.21821: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils<<< 46400 1727204512.22127: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins <<< 46400 1727204512.22142: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 46400 1727204512.22210: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 46400 1727204512.22420: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 46400 1727204512.22459: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime <<< 46400 1727204512.22468: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 46400 1727204512.22766: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 46400 1727204512.22782: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # 
cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 46400 1727204512.23025: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 46400 1727204512.23387: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 46400 1727204512.23391: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 46400 1727204512.23604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
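
The module_args echoed in the JSON result above (path=/run/ostree-booted with follow=false, get_checksum=true, get_mime=true, get_attributes=true, checksum_algorithm=sha1, i.e. the stat module's defaults) correspond to a plain stat task. A minimal sketch of the kind of task that would produce this invocation, with the register and fact names chosen here purely for illustration:

    # Sketch, not the exact task from this playbook: stat with default options,
    # which yields the module_args shown in the log above.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat        # illustrative variable name

    # The result logged above has stat.exists == false, so a fact derived
    # from it would evaluate to false on this host.
    - name: Record whether the host is ostree-based
      ansible.builtin.set_fact:
        __is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # illustrative fact name

The "loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/..." lines show that the module ran from a self-contained AnsiballZ payload, which Ansible removes from the managed node after execution; to keep such a payload around for inspection, the run can be repeated with ANSIBLE_KEEP_REMOTE_FILES=1 set on the controller.
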
<<< 46400 1727204512.23607: stdout chunk (state=3): >>><<< 46400 1727204512.23610: stderr chunk (state=3): >>><<< 46400 1727204512.23704: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f698ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f655880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f63d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f392f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3990a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f38c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3936a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3923d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f316e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f316dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f36edf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3676d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f37a730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f39ae80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f326d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f36e310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f37a340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3a0a30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326f10> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2fa430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2fa520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f32ffa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f329af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3294c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f22e280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2e5dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f329f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f3a00a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f23fbb0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f23fee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f2517f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f251d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1df460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f23ffd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1ef340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f251670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f1ef400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20b760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20ba30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20b820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20b910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f20bd60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077f2152b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20b9a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f1ffaf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f326670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077f20bb50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f077ebe6730> # zipimport: found 30 names in '/tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb0d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0dfd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0ddf0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb0d580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb0d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea64fa0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea82c70> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea82f70> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea82310> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb75dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb753a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb75f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches 
/usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb44e80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae0d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae0460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb17550> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eae0580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae05b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea55f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb552b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb55430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb6de80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea52790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea525e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea51550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea51490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb4e970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead66a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead5b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eae60a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ead6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eb19be0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077ea22ac0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead3d00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077eaca850> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead59d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077e642310> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import 
ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f077eb5e2b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ead37c0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077e629760> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea13610> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f077ea12b80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload__bs_vmu7/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # 
cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 46400 1727204512.24436: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204512.24445: _low_level_execute_command(): starting 46400 1727204512.24448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204511.6737523-46556-250492809790115/ > /dev/null 2>&1 && sleep 0' 46400 1727204512.25097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.25113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.25127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.25145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.25199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.25214: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.25227: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.25246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.25256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.25272: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.25287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.25300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.25325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.25337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.25347: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.25359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.25445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.25471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.25487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.25571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.28281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204512.28285: stdout chunk (state=3): >>><<< 46400 1727204512.28288: stderr chunk (state=3): >>><<< 46400 1727204512.28720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204512.28724: handler run complete 46400 1727204512.28726: attempt loop complete, returning result 46400 1727204512.28729: _execute() done 46400 1727204512.28731: dumping result to json 46400 1727204512.28733: done dumping result, returning 46400 1727204512.28735: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [0affcd87-79f5-1303-fda8-00000000002e] 46400 1727204512.28737: sending task result for task 0affcd87-79f5-1303-fda8-00000000002e 46400 1727204512.28809: done sending task result for task 0affcd87-79f5-1303-fda8-00000000002e 46400 1727204512.28812: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { 
"exists": false } } 46400 1727204512.28877: no more pending results, returning what we have 46400 1727204512.28881: results queue empty 46400 1727204512.28882: checking for any_errors_fatal 46400 1727204512.28887: done checking for any_errors_fatal 46400 1727204512.28888: checking for max_fail_percentage 46400 1727204512.28889: done checking for max_fail_percentage 46400 1727204512.28890: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.28891: done checking to see if all hosts have failed 46400 1727204512.28892: getting the remaining hosts for this loop 46400 1727204512.28893: done getting the remaining hosts for this loop 46400 1727204512.28896: getting the next task for host managed-node2 46400 1727204512.28901: done getting next task for host managed-node2 46400 1727204512.28904: ^ task is: TASK: Set flag to indicate system is ostree 46400 1727204512.28906: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.28909: getting variables 46400 1727204512.28910: in VariableManager get_vars() 46400 1727204512.28938: Calling all_inventory to load vars for managed-node2 46400 1727204512.28941: Calling groups_inventory to load vars for managed-node2 46400 1727204512.28945: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.28955: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.28958: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.28965: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.29134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.29328: done with get_vars() 46400 1727204512.29340: done getting variables 46400 1727204512.29437: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.684) 0:00:02.579 ***** 46400 1727204512.29477: entering _queue_task() for managed-node2/set_fact 46400 1727204512.29479: Creating lock for set_fact 46400 1727204512.29767: worker is 1 (out of 1 available) 46400 1727204512.29779: exiting _queue_task() for managed-node2/set_fact 46400 1727204512.29791: done queuing things up, now waiting for results queue to drain 46400 1727204512.29793: waiting for pending results... 
46400 1727204512.30036: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 46400 1727204512.30140: in run() - task 0affcd87-79f5-1303-fda8-00000000002f 46400 1727204512.30156: variable 'ansible_search_path' from source: unknown 46400 1727204512.30168: variable 'ansible_search_path' from source: unknown 46400 1727204512.30206: calling self._execute() 46400 1727204512.30287: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.30298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.30313: variable 'omit' from source: magic vars 46400 1727204512.30858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204512.31050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204512.31091: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204512.31116: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204512.31141: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204512.31214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204512.31230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204512.31249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204512.31270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204512.31362: Evaluated conditional (not __network_is_ostree is defined): True 46400 1727204512.31371: variable 'omit' from source: magic vars 46400 1727204512.31400: variable 'omit' from source: magic vars 46400 1727204512.31489: variable '__ostree_booted_stat' from source: set_fact 46400 1727204512.31527: variable 'omit' from source: magic vars 46400 1727204512.31546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204512.31571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204512.31585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204512.31598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.31606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.31631: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204512.31635: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.31637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.31703: Set connection var ansible_shell_type to sh 46400 
1727204512.31711: Set connection var ansible_shell_executable to /bin/sh 46400 1727204512.31717: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204512.31721: Set connection var ansible_connection to ssh 46400 1727204512.31730: Set connection var ansible_pipelining to False 46400 1727204512.31740: Set connection var ansible_timeout to 10 46400 1727204512.31759: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.31762: variable 'ansible_connection' from source: unknown 46400 1727204512.31769: variable 'ansible_module_compression' from source: unknown 46400 1727204512.31772: variable 'ansible_shell_type' from source: unknown 46400 1727204512.31774: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.31777: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.31781: variable 'ansible_pipelining' from source: unknown 46400 1727204512.31783: variable 'ansible_timeout' from source: unknown 46400 1727204512.31787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.31858: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204512.31870: variable 'omit' from source: magic vars 46400 1727204512.31875: starting attempt loop 46400 1727204512.31878: running the handler 46400 1727204512.31886: handler run complete 46400 1727204512.31893: attempt loop complete, returning result 46400 1727204512.31896: _execute() done 46400 1727204512.31899: dumping result to json 46400 1727204512.31901: done dumping result, returning 46400 1727204512.31907: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-00000000002f] 46400 1727204512.31911: sending task result for task 0affcd87-79f5-1303-fda8-00000000002f ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 46400 1727204512.32039: no more pending results, returning what we have 46400 1727204512.32042: results queue empty 46400 1727204512.32043: checking for any_errors_fatal 46400 1727204512.32051: done checking for any_errors_fatal 46400 1727204512.32052: checking for max_fail_percentage 46400 1727204512.32053: done checking for max_fail_percentage 46400 1727204512.32054: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.32055: done checking to see if all hosts have failed 46400 1727204512.32056: getting the remaining hosts for this loop 46400 1727204512.32057: done getting the remaining hosts for this loop 46400 1727204512.32061: getting the next task for host managed-node2 46400 1727204512.32071: done getting next task for host managed-node2 46400 1727204512.32074: ^ task is: TASK: Fix CentOS6 Base repo 46400 1727204512.32077: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.32080: getting variables 46400 1727204512.32082: in VariableManager get_vars() 46400 1727204512.32111: Calling all_inventory to load vars for managed-node2 46400 1727204512.32113: Calling groups_inventory to load vars for managed-node2 46400 1727204512.32116: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.32126: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.32128: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.32131: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.32282: done sending task result for task 0affcd87-79f5-1303-fda8-00000000002f 46400 1727204512.32297: WORKER PROCESS EXITING 46400 1727204512.32302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.32419: done with get_vars() 46400 1727204512.32432: done getting variables 46400 1727204512.32550: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.031) 0:00:02.610 ***** 46400 1727204512.32581: entering _queue_task() for managed-node2/copy 46400 1727204512.32826: worker is 1 (out of 1 available) 46400 1727204512.32837: exiting _queue_task() for managed-node2/copy 46400 1727204512.32850: done queuing things up, now waiting for results queue to drain 46400 1727204512.32852: waiting for pending results... 
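The "Set flag to indicate system is ostree" task traced above (el_repo_setup.yml:22) produced ansible_facts with __network_is_ostree: false after evaluating the conditional "not __network_is_ostree is defined". A plausible sketch of that task, assuming the fact is simply derived from the registered stat result; the exact expression is an assumption, since only the conditional and the resulting fact appear in the log:

    # Sketch only: reconstructed from the evaluated conditional and the
    # ansible_facts printed in the log, not copied from el_repo_setup.yml.
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined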
46400 1727204512.33095: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 46400 1727204512.33187: in run() - task 0affcd87-79f5-1303-fda8-000000000031 46400 1727204512.33205: variable 'ansible_search_path' from source: unknown 46400 1727204512.33213: variable 'ansible_search_path' from source: unknown 46400 1727204512.33252: calling self._execute() 46400 1727204512.33332: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.33344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.33357: variable 'omit' from source: magic vars 46400 1727204512.33838: variable 'ansible_distribution' from source: facts 46400 1727204512.33876: Evaluated conditional (ansible_distribution == 'CentOS'): True 46400 1727204512.33988: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.33991: Evaluated conditional (ansible_distribution_major_version == '6'): False 46400 1727204512.33994: when evaluation is False, skipping this task 46400 1727204512.33997: _execute() done 46400 1727204512.33999: dumping result to json 46400 1727204512.34002: done dumping result, returning 46400 1727204512.34007: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [0affcd87-79f5-1303-fda8-000000000031] 46400 1727204512.34024: sending task result for task 0affcd87-79f5-1303-fda8-000000000031 46400 1727204512.34108: done sending task result for task 0affcd87-79f5-1303-fda8-000000000031 46400 1727204512.34112: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 46400 1727204512.34155: no more pending results, returning what we have 46400 1727204512.34158: results queue empty 46400 1727204512.34159: checking for any_errors_fatal 46400 1727204512.34168: done checking for any_errors_fatal 46400 1727204512.34169: checking for max_fail_percentage 46400 1727204512.34170: done checking for max_fail_percentage 46400 1727204512.34171: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.34172: done checking to see if all hosts have failed 46400 1727204512.34173: getting the remaining hosts for this loop 46400 1727204512.34174: done getting the remaining hosts for this loop 46400 1727204512.34178: getting the next task for host managed-node2 46400 1727204512.34185: done getting next task for host managed-node2 46400 1727204512.34188: ^ task is: TASK: Include the task 'enable_epel.yml' 46400 1727204512.34190: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.34193: getting variables 46400 1727204512.34194: in VariableManager get_vars() 46400 1727204512.34219: Calling all_inventory to load vars for managed-node2 46400 1727204512.34222: Calling groups_inventory to load vars for managed-node2 46400 1727204512.34225: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.34234: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.34236: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.34238: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.34353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.34474: done with get_vars() 46400 1727204512.34481: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.019) 0:00:02.630 ***** 46400 1727204512.34546: entering _queue_task() for managed-node2/include_tasks 46400 1727204512.34737: worker is 1 (out of 1 available) 46400 1727204512.34751: exiting _queue_task() for managed-node2/include_tasks 46400 1727204512.34767: done queuing things up, now waiting for results queue to drain 46400 1727204512.34769: waiting for pending results... 46400 1727204512.34912: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 46400 1727204512.34978: in run() - task 0affcd87-79f5-1303-fda8-000000000032 46400 1727204512.34989: variable 'ansible_search_path' from source: unknown 46400 1727204512.34992: variable 'ansible_search_path' from source: unknown 46400 1727204512.35018: calling self._execute() 46400 1727204512.35119: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.35123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.35131: variable 'omit' from source: magic vars 46400 1727204512.35463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204512.37597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204512.37670: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204512.37722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204512.37767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204512.37800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204512.37884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204512.37918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204512.37950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 46400 1727204512.38001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204512.38020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204512.38150: variable '__network_is_ostree' from source: set_fact 46400 1727204512.38158: Evaluated conditional (not __network_is_ostree | d(false)): True 46400 1727204512.38168: _execute() done 46400 1727204512.38171: dumping result to json 46400 1727204512.38174: done dumping result, returning 46400 1727204512.38180: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-1303-fda8-000000000032] 46400 1727204512.38187: sending task result for task 0affcd87-79f5-1303-fda8-000000000032 46400 1727204512.38292: done sending task result for task 0affcd87-79f5-1303-fda8-000000000032 46400 1727204512.38294: WORKER PROCESS EXITING 46400 1727204512.38319: no more pending results, returning what we have 46400 1727204512.38324: in VariableManager get_vars() 46400 1727204512.38357: Calling all_inventory to load vars for managed-node2 46400 1727204512.38376: Calling groups_inventory to load vars for managed-node2 46400 1727204512.38381: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.38391: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.38394: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.38396: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.38572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.38688: done with get_vars() 46400 1727204512.38694: variable 'ansible_search_path' from source: unknown 46400 1727204512.38695: variable 'ansible_search_path' from source: unknown 46400 1727204512.38719: we have included files to process 46400 1727204512.38719: generating all_blocks data 46400 1727204512.38721: done generating all_blocks data 46400 1727204512.38723: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 46400 1727204512.38724: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 46400 1727204512.38726: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 46400 1727204512.39212: done processing included file 46400 1727204512.39213: iterating over new_blocks loaded from include file 46400 1727204512.39214: in VariableManager get_vars() 46400 1727204512.39222: done with get_vars() 46400 1727204512.39223: filtering new block on tags 46400 1727204512.39238: done filtering new block on tags 46400 1727204512.39240: in VariableManager get_vars() 46400 1727204512.39247: done with get_vars() 46400 1727204512.39248: filtering new block on tags 46400 1727204512.39254: done filtering new block on tags 46400 1727204512.39255: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 46400 1727204512.39259: extending task lists for all hosts with included blocks 
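The two tasks traced above behave as the log records: "Fix CentOS6 Base repo" (a copy action) is skipped because ansible_distribution_major_version == '6' evaluates False on these CentOS hosts, while "Include the task 'enable_epel.yml'" runs because not __network_is_ostree | d(false) evaluates True and pulls in enable_epel.yml from the same tasks directory. A rough sketch of how such tasks are typically written; the copy content and destination are hypothetical placeholders, since the log shows only the conditionals and action plugins:

    # Sketch based on the conditionals and action plugins named in the log;
    # the repo file body and dest path below are hypothetical placeholders.
    - name: Fix CentOS6 Base repo
      ansible.builtin.copy:
        content: "..."   # replacement repo definition, not shown in this log
        dest: /etc/yum.repos.d/CentOS-Base.repo
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'

    - name: Include the task 'enable_epel.yml'
      ansible.builtin.include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)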
46400 1727204512.39326: done extending task lists 46400 1727204512.39327: done processing included files 46400 1727204512.39328: results queue empty 46400 1727204512.39328: checking for any_errors_fatal 46400 1727204512.39330: done checking for any_errors_fatal 46400 1727204512.39331: checking for max_fail_percentage 46400 1727204512.39331: done checking for max_fail_percentage 46400 1727204512.39332: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.39332: done checking to see if all hosts have failed 46400 1727204512.39333: getting the remaining hosts for this loop 46400 1727204512.39334: done getting the remaining hosts for this loop 46400 1727204512.39335: getting the next task for host managed-node2 46400 1727204512.39338: done getting next task for host managed-node2 46400 1727204512.39339: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 46400 1727204512.39341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.39342: getting variables 46400 1727204512.39343: in VariableManager get_vars() 46400 1727204512.39348: Calling all_inventory to load vars for managed-node2 46400 1727204512.39350: Calling groups_inventory to load vars for managed-node2 46400 1727204512.39351: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.39354: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.39360: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.39362: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.39445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.39558: done with get_vars() 46400 1727204512.39566: done getting variables 46400 1727204512.39613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 46400 1727204512.39749: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.052) 0:00:02.682 ***** 46400 1727204512.39785: entering _queue_task() for managed-node2/command 46400 1727204512.39786: Creating lock for command 46400 1727204512.40063: worker is 1 (out of 1 available) 46400 1727204512.40078: exiting _queue_task() for managed-node2/command 46400 1727204512.40090: done queuing things up, now waiting for results queue to drain 46400 1727204512.40091: waiting for pending results... 
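
The next entries queue the templated task "Create EPEL {{ ansible_distribution_major_version }}" from enable_epel.yml:8 through the command action plugin; the skip result just below reports that it only applies when ansible_distribution_major_version is '7' or '8'. A hedged sketch of such a task follows, assuming an epel-release install command; the task name, module, and conditions mirror the log, but the command itself is an illustrative guess, not the real enable_epel.yml content.

# Sketch only: the name, module (command), and 'when' conditions match the log;
# the command shown is an assumed example of creating the EPEL repo.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    cmd: >-
      yum install -y
      https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']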
46400 1727204512.40325: running TaskExecutor() for managed-node2/TASK: Create EPEL 9 46400 1727204512.40432: in run() - task 0affcd87-79f5-1303-fda8-00000000004c 46400 1727204512.40453: variable 'ansible_search_path' from source: unknown 46400 1727204512.40460: variable 'ansible_search_path' from source: unknown 46400 1727204512.40506: calling self._execute() 46400 1727204512.40582: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.40593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.40612: variable 'omit' from source: magic vars 46400 1727204512.41034: variable 'ansible_distribution' from source: facts 46400 1727204512.41049: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 46400 1727204512.41186: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.41200: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 46400 1727204512.41207: when evaluation is False, skipping this task 46400 1727204512.41213: _execute() done 46400 1727204512.41219: dumping result to json 46400 1727204512.41225: done dumping result, returning 46400 1727204512.41234: done running TaskExecutor() for managed-node2/TASK: Create EPEL 9 [0affcd87-79f5-1303-fda8-00000000004c] 46400 1727204512.41243: sending task result for task 0affcd87-79f5-1303-fda8-00000000004c skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 46400 1727204512.41399: no more pending results, returning what we have 46400 1727204512.41403: results queue empty 46400 1727204512.41404: checking for any_errors_fatal 46400 1727204512.41406: done checking for any_errors_fatal 46400 1727204512.41406: checking for max_fail_percentage 46400 1727204512.41408: done checking for max_fail_percentage 46400 1727204512.41409: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.41410: done checking to see if all hosts have failed 46400 1727204512.41410: getting the remaining hosts for this loop 46400 1727204512.41412: done getting the remaining hosts for this loop 46400 1727204512.41417: getting the next task for host managed-node2 46400 1727204512.41424: done getting next task for host managed-node2 46400 1727204512.41427: ^ task is: TASK: Install yum-utils package 46400 1727204512.41431: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.41434: getting variables 46400 1727204512.41435: in VariableManager get_vars() 46400 1727204512.41463: Calling all_inventory to load vars for managed-node2 46400 1727204512.41467: Calling groups_inventory to load vars for managed-node2 46400 1727204512.41472: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.41484: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.41487: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.41490: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.41705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.41918: done with get_vars() 46400 1727204512.41927: done getting variables 46400 1727204512.41976: done sending task result for task 0affcd87-79f5-1303-fda8-00000000004c 46400 1727204512.42080: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 46400 1727204512.42104: WORKER PROCESS EXITING TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.023) 0:00:02.705 ***** 46400 1727204512.42118: entering _queue_task() for managed-node2/package 46400 1727204512.42120: Creating lock for package 46400 1727204512.42589: worker is 1 (out of 1 available) 46400 1727204512.42601: exiting _queue_task() for managed-node2/package 46400 1727204512.42612: done queuing things up, now waiting for results queue to drain 46400 1727204512.42613: waiting for pending results... 
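
The log then queues "Install yum-utils package" (enable_epel.yml:26) through the package action plugin, and it is skipped on this host for the same reason as the previous task. A minimal sketch of that task follows, with the same caveat that only the name, module, and the conditions shown in the skip result are taken from the log; the package name is inferred from the task name.

# Sketch; the package name 'yum-utils' is inferred from the task name, and the
# conditions mirror the false_condition reported in the skip result below.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']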
46400 1727204512.42862: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 46400 1727204512.42979: in run() - task 0affcd87-79f5-1303-fda8-00000000004d 46400 1727204512.43000: variable 'ansible_search_path' from source: unknown 46400 1727204512.43007: variable 'ansible_search_path' from source: unknown 46400 1727204512.43044: calling self._execute() 46400 1727204512.43123: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.43133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.43147: variable 'omit' from source: magic vars 46400 1727204512.43529: variable 'ansible_distribution' from source: facts 46400 1727204512.43548: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 46400 1727204512.43688: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.43700: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 46400 1727204512.43712: when evaluation is False, skipping this task 46400 1727204512.43718: _execute() done 46400 1727204512.43723: dumping result to json 46400 1727204512.43729: done dumping result, returning 46400 1727204512.43739: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [0affcd87-79f5-1303-fda8-00000000004d] 46400 1727204512.43752: sending task result for task 0affcd87-79f5-1303-fda8-00000000004d 46400 1727204512.43906: done sending task result for task 0affcd87-79f5-1303-fda8-00000000004d 46400 1727204512.43910: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 46400 1727204512.43951: no more pending results, returning what we have 46400 1727204512.43955: results queue empty 46400 1727204512.43956: checking for any_errors_fatal 46400 1727204512.43967: done checking for any_errors_fatal 46400 1727204512.43968: checking for max_fail_percentage 46400 1727204512.43970: done checking for max_fail_percentage 46400 1727204512.43970: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.43971: done checking to see if all hosts have failed 46400 1727204512.43972: getting the remaining hosts for this loop 46400 1727204512.43973: done getting the remaining hosts for this loop 46400 1727204512.43977: getting the next task for host managed-node2 46400 1727204512.43987: done getting next task for host managed-node2 46400 1727204512.43989: ^ task is: TASK: Enable EPEL 7 46400 1727204512.43993: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.43996: getting variables 46400 1727204512.43997: in VariableManager get_vars() 46400 1727204512.44034: Calling all_inventory to load vars for managed-node2 46400 1727204512.44037: Calling groups_inventory to load vars for managed-node2 46400 1727204512.44041: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.44051: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.44053: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.44055: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.44173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.44288: done with get_vars() 46400 1727204512.44295: done getting variables 46400 1727204512.44335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.022) 0:00:02.728 ***** 46400 1727204512.44356: entering _queue_task() for managed-node2/command 46400 1727204512.44537: worker is 1 (out of 1 available) 46400 1727204512.44551: exiting _queue_task() for managed-node2/command 46400 1727204512.44562: done queuing things up, now waiting for results queue to drain 46400 1727204512.44565: waiting for pending results... 46400 1727204512.44715: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 46400 1727204512.44780: in run() - task 0affcd87-79f5-1303-fda8-00000000004e 46400 1727204512.44789: variable 'ansible_search_path' from source: unknown 46400 1727204512.44797: variable 'ansible_search_path' from source: unknown 46400 1727204512.44824: calling self._execute() 46400 1727204512.44926: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.44930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.44943: variable 'omit' from source: magic vars 46400 1727204512.45197: variable 'ansible_distribution' from source: facts 46400 1727204512.45205: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 46400 1727204512.45300: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.45304: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 46400 1727204512.45307: when evaluation is False, skipping this task 46400 1727204512.45310: _execute() done 46400 1727204512.45312: dumping result to json 46400 1727204512.45314: done dumping result, returning 46400 1727204512.45320: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [0affcd87-79f5-1303-fda8-00000000004e] 46400 1727204512.45325: sending task result for task 0affcd87-79f5-1303-fda8-00000000004e 46400 1727204512.45404: done sending task result for task 0affcd87-79f5-1303-fda8-00000000004e 46400 1727204512.45406: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 46400 1727204512.45525: no more pending results, returning what we 
have 46400 1727204512.45527: results queue empty 46400 1727204512.45528: checking for any_errors_fatal 46400 1727204512.45530: done checking for any_errors_fatal 46400 1727204512.45530: checking for max_fail_percentage 46400 1727204512.45531: done checking for max_fail_percentage 46400 1727204512.45532: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.45532: done checking to see if all hosts have failed 46400 1727204512.45533: getting the remaining hosts for this loop 46400 1727204512.45534: done getting the remaining hosts for this loop 46400 1727204512.45536: getting the next task for host managed-node2 46400 1727204512.45540: done getting next task for host managed-node2 46400 1727204512.45541: ^ task is: TASK: Enable EPEL 8 46400 1727204512.45544: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.45546: getting variables 46400 1727204512.45547: in VariableManager get_vars() 46400 1727204512.45563: Calling all_inventory to load vars for managed-node2 46400 1727204512.45566: Calling groups_inventory to load vars for managed-node2 46400 1727204512.45568: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.45575: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.45576: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.45578: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.45753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.45942: done with get_vars() 46400 1727204512.45957: done getting variables 46400 1727204512.46017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.016) 0:00:02.745 ***** 46400 1727204512.46049: entering _queue_task() for managed-node2/command 46400 1727204512.46286: worker is 1 (out of 1 available) 46400 1727204512.46298: exiting _queue_task() for managed-node2/command 46400 1727204512.46310: done queuing things up, now waiting for results queue to drain 46400 1727204512.46312: waiting for pending results... 
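
"Enable EPEL 7" above and "Enable EPEL 8" queued here follow the same command-module pattern already sketched for the Create EPEL task. The "Enable EPEL 6" task that appears a little further down in the log switches to the copy action plugin and is gated on major version '6' instead. A hedged sketch of that copy variant follows; only the name, module, and conditions come from the log, while the destination path and file content are placeholders.

# Sketch of the copy-based variant seen at enable_epel.yml:42; the dest path
# and repo-file content below are illustrative placeholders, not the real file.
- name: Enable EPEL 6
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: |
      [epel]
      enabled=1
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'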
46400 1727204512.46569: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 46400 1727204512.46680: in run() - task 0affcd87-79f5-1303-fda8-00000000004f 46400 1727204512.46698: variable 'ansible_search_path' from source: unknown 46400 1727204512.46707: variable 'ansible_search_path' from source: unknown 46400 1727204512.46755: calling self._execute() 46400 1727204512.46843: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.46857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.46880: variable 'omit' from source: magic vars 46400 1727204512.47192: variable 'ansible_distribution' from source: facts 46400 1727204512.47202: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 46400 1727204512.47291: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.47295: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 46400 1727204512.47299: when evaluation is False, skipping this task 46400 1727204512.47304: _execute() done 46400 1727204512.47306: dumping result to json 46400 1727204512.47308: done dumping result, returning 46400 1727204512.47314: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [0affcd87-79f5-1303-fda8-00000000004f] 46400 1727204512.47321: sending task result for task 0affcd87-79f5-1303-fda8-00000000004f 46400 1727204512.47402: done sending task result for task 0affcd87-79f5-1303-fda8-00000000004f 46400 1727204512.47404: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 46400 1727204512.47459: no more pending results, returning what we have 46400 1727204512.47463: results queue empty 46400 1727204512.47465: checking for any_errors_fatal 46400 1727204512.47471: done checking for any_errors_fatal 46400 1727204512.47472: checking for max_fail_percentage 46400 1727204512.47473: done checking for max_fail_percentage 46400 1727204512.47474: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.47475: done checking to see if all hosts have failed 46400 1727204512.47475: getting the remaining hosts for this loop 46400 1727204512.47477: done getting the remaining hosts for this loop 46400 1727204512.47480: getting the next task for host managed-node2 46400 1727204512.47488: done getting next task for host managed-node2 46400 1727204512.47491: ^ task is: TASK: Enable EPEL 6 46400 1727204512.47494: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.47497: getting variables 46400 1727204512.47498: in VariableManager get_vars() 46400 1727204512.47523: Calling all_inventory to load vars for managed-node2 46400 1727204512.47525: Calling groups_inventory to load vars for managed-node2 46400 1727204512.47528: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.47537: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.47539: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.47541: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.47649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.47783: done with get_vars() 46400 1727204512.47789: done getting variables 46400 1727204512.47832: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.018) 0:00:02.763 ***** 46400 1727204512.47852: entering _queue_task() for managed-node2/copy 46400 1727204512.48030: worker is 1 (out of 1 available) 46400 1727204512.48043: exiting _queue_task() for managed-node2/copy 46400 1727204512.48056: done queuing things up, now waiting for results queue to drain 46400 1727204512.48057: waiting for pending results... 46400 1727204512.48207: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 46400 1727204512.48274: in run() - task 0affcd87-79f5-1303-fda8-000000000051 46400 1727204512.48280: variable 'ansible_search_path' from source: unknown 46400 1727204512.48287: variable 'ansible_search_path' from source: unknown 46400 1727204512.48313: calling self._execute() 46400 1727204512.48369: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.48374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.48386: variable 'omit' from source: magic vars 46400 1727204512.48737: variable 'ansible_distribution' from source: facts 46400 1727204512.48747: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 46400 1727204512.48828: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.48834: Evaluated conditional (ansible_distribution_major_version == '6'): False 46400 1727204512.48837: when evaluation is False, skipping this task 46400 1727204512.48840: _execute() done 46400 1727204512.48843: dumping result to json 46400 1727204512.48845: done dumping result, returning 46400 1727204512.48851: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [0affcd87-79f5-1303-fda8-000000000051] 46400 1727204512.48856: sending task result for task 0affcd87-79f5-1303-fda8-000000000051 46400 1727204512.48942: done sending task result for task 0affcd87-79f5-1303-fda8-000000000051 46400 1727204512.48945: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 46400 1727204512.48997: no more pending results, returning what we have 46400 
1727204512.49000: results queue empty 46400 1727204512.49001: checking for any_errors_fatal 46400 1727204512.49006: done checking for any_errors_fatal 46400 1727204512.49007: checking for max_fail_percentage 46400 1727204512.49008: done checking for max_fail_percentage 46400 1727204512.49009: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.49009: done checking to see if all hosts have failed 46400 1727204512.49010: getting the remaining hosts for this loop 46400 1727204512.49011: done getting the remaining hosts for this loop 46400 1727204512.49014: getting the next task for host managed-node2 46400 1727204512.49022: done getting next task for host managed-node2 46400 1727204512.49024: ^ task is: TASK: Set network provider to 'nm' 46400 1727204512.49026: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.49029: getting variables 46400 1727204512.49030: in VariableManager get_vars() 46400 1727204512.49057: Calling all_inventory to load vars for managed-node2 46400 1727204512.49061: Calling groups_inventory to load vars for managed-node2 46400 1727204512.49066: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.49074: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.49075: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.49077: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.49179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.49291: done with get_vars() 46400 1727204512.49298: done getting variables 46400 1727204512.49338: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.015) 0:00:02.778 ***** 46400 1727204512.49357: entering _queue_task() for managed-node2/set_fact 46400 1727204512.49530: worker is 1 (out of 1 available) 46400 1727204512.49543: exiting _queue_task() for managed-node2/set_fact 46400 1727204512.49555: done queuing things up, now waiting for results queue to drain 46400 1727204512.49557: waiting for pending results... 
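
The EPEL setup ends and the play moves on to "Set network provider to 'nm'" from tests_states_nm.yml:13, handled by the set_fact action plugin; the ok result a little further below confirms that it simply sets the fact network_provider to "nm". A sketch of that set_fact task follows; the task name and the resulting fact name and value are taken from the log, everything else is assumed.

# Sketch of the set_fact seen at tests_states_nm.yml:13; the fact name and
# value match the "ansible_facts" shown in the ok result below.
- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm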
46400 1727204512.49712: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 46400 1727204512.49765: in run() - task 0affcd87-79f5-1303-fda8-000000000007 46400 1727204512.49778: variable 'ansible_search_path' from source: unknown 46400 1727204512.49805: calling self._execute() 46400 1727204512.49857: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.49860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.49872: variable 'omit' from source: magic vars 46400 1727204512.49945: variable 'omit' from source: magic vars 46400 1727204512.49972: variable 'omit' from source: magic vars 46400 1727204512.49998: variable 'omit' from source: magic vars 46400 1727204512.50032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204512.50063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204512.50088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204512.50101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.50112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.50136: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204512.50139: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.50157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.50254: Set connection var ansible_shell_type to sh 46400 1727204512.50279: Set connection var ansible_shell_executable to /bin/sh 46400 1727204512.50294: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204512.50305: Set connection var ansible_connection to ssh 46400 1727204512.50314: Set connection var ansible_pipelining to False 46400 1727204512.50322: Set connection var ansible_timeout to 10 46400 1727204512.50348: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.50356: variable 'ansible_connection' from source: unknown 46400 1727204512.50362: variable 'ansible_module_compression' from source: unknown 46400 1727204512.50375: variable 'ansible_shell_type' from source: unknown 46400 1727204512.50387: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.50398: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.50405: variable 'ansible_pipelining' from source: unknown 46400 1727204512.50411: variable 'ansible_timeout' from source: unknown 46400 1727204512.50417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.50566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204512.50582: variable 'omit' from source: magic vars 46400 1727204512.50597: starting attempt loop 46400 1727204512.50607: running the handler 46400 1727204512.50625: handler run complete 46400 1727204512.50639: attempt loop complete, returning result 46400 1727204512.50646: _execute() done 46400 1727204512.50652: 
dumping result to json 46400 1727204512.50658: done dumping result, returning 46400 1727204512.50669: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [0affcd87-79f5-1303-fda8-000000000007] 46400 1727204512.50677: sending task result for task 0affcd87-79f5-1303-fda8-000000000007 46400 1727204512.50780: done sending task result for task 0affcd87-79f5-1303-fda8-000000000007 46400 1727204512.50787: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 46400 1727204512.51016: no more pending results, returning what we have 46400 1727204512.51024: results queue empty 46400 1727204512.51025: checking for any_errors_fatal 46400 1727204512.51029: done checking for any_errors_fatal 46400 1727204512.51030: checking for max_fail_percentage 46400 1727204512.51032: done checking for max_fail_percentage 46400 1727204512.51033: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.51034: done checking to see if all hosts have failed 46400 1727204512.51034: getting the remaining hosts for this loop 46400 1727204512.51036: done getting the remaining hosts for this loop 46400 1727204512.51039: getting the next task for host managed-node2 46400 1727204512.51044: done getting next task for host managed-node2 46400 1727204512.51046: ^ task is: TASK: meta (flush_handlers) 46400 1727204512.51048: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.51051: getting variables 46400 1727204512.51053: in VariableManager get_vars() 46400 1727204512.51081: Calling all_inventory to load vars for managed-node2 46400 1727204512.51084: Calling groups_inventory to load vars for managed-node2 46400 1727204512.51087: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.51099: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.51102: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.51105: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.51335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.51537: done with get_vars() 46400 1727204512.51546: done getting variables 46400 1727204512.51614: in VariableManager get_vars() 46400 1727204512.51623: Calling all_inventory to load vars for managed-node2 46400 1727204512.51626: Calling groups_inventory to load vars for managed-node2 46400 1727204512.51628: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.51632: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.51635: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.51644: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.51782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.51972: done with get_vars() 46400 1727204512.51988: done queuing things up, now waiting for results queue to drain 46400 1727204512.51990: results queue empty 46400 1727204512.51991: checking for any_errors_fatal 46400 1727204512.51993: done checking for any_errors_fatal 46400 1727204512.51994: checking for 
max_fail_percentage 46400 1727204512.51995: done checking for max_fail_percentage 46400 1727204512.51995: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.51996: done checking to see if all hosts have failed 46400 1727204512.51997: getting the remaining hosts for this loop 46400 1727204512.51998: done getting the remaining hosts for this loop 46400 1727204512.52004: getting the next task for host managed-node2 46400 1727204512.52007: done getting next task for host managed-node2 46400 1727204512.52009: ^ task is: TASK: meta (flush_handlers) 46400 1727204512.52010: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.52018: getting variables 46400 1727204512.52019: in VariableManager get_vars() 46400 1727204512.52027: Calling all_inventory to load vars for managed-node2 46400 1727204512.52029: Calling groups_inventory to load vars for managed-node2 46400 1727204512.52031: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.52035: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.52038: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.52042: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.52171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.52293: done with get_vars() 46400 1727204512.52300: done getting variables 46400 1727204512.52337: in VariableManager get_vars() 46400 1727204512.52344: Calling all_inventory to load vars for managed-node2 46400 1727204512.52345: Calling groups_inventory to load vars for managed-node2 46400 1727204512.52347: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.52349: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.52351: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.52352: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.52453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.52566: done with get_vars() 46400 1727204512.52574: done queuing things up, now waiting for results queue to drain 46400 1727204512.52575: results queue empty 46400 1727204512.52576: checking for any_errors_fatal 46400 1727204512.52577: done checking for any_errors_fatal 46400 1727204512.52577: checking for max_fail_percentage 46400 1727204512.52578: done checking for max_fail_percentage 46400 1727204512.52578: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.52579: done checking to see if all hosts have failed 46400 1727204512.52579: getting the remaining hosts for this loop 46400 1727204512.52580: done getting the remaining hosts for this loop 46400 1727204512.52581: getting the next task for host managed-node2 46400 1727204512.52583: done getting next task for host managed-node2 46400 1727204512.52584: ^ task is: None 46400 1727204512.52585: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.52585: done queuing things up, now waiting for results queue to drain 46400 1727204512.52586: results queue empty 46400 1727204512.52586: checking for any_errors_fatal 46400 1727204512.52587: done checking for any_errors_fatal 46400 1727204512.52587: checking for max_fail_percentage 46400 1727204512.52588: done checking for max_fail_percentage 46400 1727204512.52588: checking to see if all hosts have failed and the running result is not ok 46400 1727204512.52589: done checking to see if all hosts have failed 46400 1727204512.52590: getting the next task for host managed-node2 46400 1727204512.52591: done getting next task for host managed-node2 46400 1727204512.52592: ^ task is: None 46400 1727204512.52593: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204512.52630: in VariableManager get_vars() 46400 1727204512.52642: done with get_vars() 46400 1727204512.52646: in VariableManager get_vars() 46400 1727204512.52652: done with get_vars() 46400 1727204512.52656: variable 'omit' from source: magic vars 46400 1727204512.52680: in VariableManager get_vars() 46400 1727204512.52687: done with get_vars() 46400 1727204512.52701: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 46400 1727204512.52893: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 46400 1727204512.52916: getting the remaining hosts for this loop 46400 1727204512.52917: done getting the remaining hosts for this loop 46400 1727204512.52918: getting the next task for host managed-node2 46400 1727204512.52920: done getting next task for host managed-node2 46400 1727204512.52921: ^ task is: TASK: Gathering Facts 46400 1727204512.52922: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204512.52924: getting variables 46400 1727204512.52924: in VariableManager get_vars() 46400 1727204512.52930: Calling all_inventory to load vars for managed-node2 46400 1727204512.52931: Calling groups_inventory to load vars for managed-node2 46400 1727204512.52932: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204512.52935: Calling all_plugins_play to load vars for managed-node2 46400 1727204512.52945: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204512.52947: Calling groups_plugins_play to load vars for managed-node2 46400 1727204512.53049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204512.53154: done with get_vars() 46400 1727204512.53160: done getting variables 46400 1727204512.53191: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.038) 0:00:02.816 ***** 46400 1727204512.53208: entering _queue_task() for managed-node2/gather_facts 46400 1727204512.53397: worker is 1 (out of 1 available) 46400 1727204512.53409: exiting _queue_task() for managed-node2/gather_facts 46400 1727204512.53421: done queuing things up, now waiting for results queue to drain 46400 1727204512.53423: waiting for pending results... 
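
A new play, "Play for testing states", starts here, and its Gathering Facts task (tests_states.yml:3) runs the setup module on the managed node over SSH, as the _low_level_execute_command and ANSIBALLZ entries below show. A hedged sketch of a play header that would produce this behaviour follows; only the play name comes from the log, while the hosts pattern and explicit gather_facts setting are assumptions about tests_states.yml.

# Sketch of a play that would trigger the "Gathering Facts" task above; the
# play name matches the log, 'hosts: all' and 'gather_facts: true' are assumed.
- name: Play for testing states
  hosts: all
  gather_facts: true   # fact gathering is what drives the setup module run below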
46400 1727204512.53575: running TaskExecutor() for managed-node2/TASK: Gathering Facts 46400 1727204512.53635: in run() - task 0affcd87-79f5-1303-fda8-000000000077 46400 1727204512.53647: variable 'ansible_search_path' from source: unknown 46400 1727204512.53677: calling self._execute() 46400 1727204512.53730: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.53734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.53739: variable 'omit' from source: magic vars 46400 1727204512.54033: variable 'ansible_distribution_major_version' from source: facts 46400 1727204512.54056: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204512.54070: variable 'omit' from source: magic vars 46400 1727204512.54102: variable 'omit' from source: magic vars 46400 1727204512.54145: variable 'omit' from source: magic vars 46400 1727204512.54197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204512.54236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204512.54271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204512.54294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.54312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204512.54343: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204512.54358: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.54373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.54486: Set connection var ansible_shell_type to sh 46400 1727204512.54501: Set connection var ansible_shell_executable to /bin/sh 46400 1727204512.54510: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204512.54524: Set connection var ansible_connection to ssh 46400 1727204512.54532: Set connection var ansible_pipelining to False 46400 1727204512.54541: Set connection var ansible_timeout to 10 46400 1727204512.54571: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.54583: variable 'ansible_connection' from source: unknown 46400 1727204512.54593: variable 'ansible_module_compression' from source: unknown 46400 1727204512.54600: variable 'ansible_shell_type' from source: unknown 46400 1727204512.54606: variable 'ansible_shell_executable' from source: unknown 46400 1727204512.54612: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204512.54619: variable 'ansible_pipelining' from source: unknown 46400 1727204512.54629: variable 'ansible_timeout' from source: unknown 46400 1727204512.54637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204512.54840: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204512.54858: variable 'omit' from source: magic vars 46400 1727204512.54870: starting attempt loop 46400 1727204512.54877: running the 
handler 46400 1727204512.54900: variable 'ansible_facts' from source: unknown 46400 1727204512.54929: _low_level_execute_command(): starting 46400 1727204512.54941: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204512.55758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.55782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.55797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.55819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.55867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.55883: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.55903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.55922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.55938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.55948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.55960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.55978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.55997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.56011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.56022: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.56034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.56123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.56146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.56167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.56258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.58485: stdout chunk (state=3): >>>/root <<< 46400 1727204512.58637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204512.58733: stderr chunk (state=3): >>><<< 46400 1727204512.58755: stdout chunk (state=3): >>><<< 46400 1727204512.58869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204512.58873: _low_level_execute_command(): starting 46400 1727204512.58876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695 `" && echo ansible-tmp-1727204512.5879068-46606-209494570949695="` echo /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695 `" ) && sleep 0' 46400 1727204512.59514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.59526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.59542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.59567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.59610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.59621: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.59633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.59652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.59672: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.59686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.59698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.59710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.59724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.59734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.59744: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.59755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.59842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.59863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.59888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.59966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.62584: stdout chunk (state=3): >>>ansible-tmp-1727204512.5879068-46606-209494570949695=/root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695 <<< 46400 1727204512.62787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204512.62906: stderr chunk (state=3): >>><<< 46400 1727204512.62909: stdout chunk (state=3): >>><<< 46400 1727204512.63211: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204512.5879068-46606-209494570949695=/root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204512.63214: variable 'ansible_module_compression' from source: unknown 46400 1727204512.63217: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 46400 1727204512.63219: variable 'ansible_facts' from source: unknown 46400 1727204512.63302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/AnsiballZ_setup.py 46400 1727204512.63493: Sending initial data 46400 1727204512.63497: Sent initial data (154 bytes) 46400 1727204512.64553: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.64569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.64583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.64598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.64646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.64657: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.64676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.64694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.64703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.64712: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.64722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.64743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.64757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.64771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.64782: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.64794: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.64882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.64904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.64918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.64996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.67391: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204512.67429: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204512.67480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmph8bb1egh /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/AnsiballZ_setup.py <<< 46400 1727204512.67517: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204512.70077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204512.70371: stderr chunk (state=3): >>><<< 46400 1727204512.70375: stdout chunk (state=3): >>><<< 46400 1727204512.70377: done transferring module to remote 46400 1727204512.70383: _low_level_execute_command(): starting 46400 1727204512.70385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/ /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/AnsiballZ_setup.py && sleep 0' 46400 1727204512.70997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.71012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.71026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.71052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.71100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.71112: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.71126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.71142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.71166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.71180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.71193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.71206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.71221: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.71232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.71243: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.71255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.71333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.71355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.71383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.71468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204512.73986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204512.74050: stderr chunk (state=3): >>><<< 46400 1727204512.74054: stdout chunk (state=3): >>><<< 46400 1727204512.74177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204512.74181: _low_level_execute_command(): starting 46400 1727204512.74184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/AnsiballZ_setup.py && sleep 0' 46400 1727204512.74875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204512.74890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.74906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.74925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.74987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.74999: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204512.75013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.75030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204512.75053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204512.75075: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204512.75089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204512.75102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204512.75118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204512.75130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204512.75142: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204512.75170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204512.75247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204512.75284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204512.75303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204512.75401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204513.43305: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "53", "epoch": "1727204513", "epoch_int": "1727204513", "date": "2024-09-24", "time": "15:01:53", "iso8601_micro": "2024-09-24T19:01:53.128535Z", "iso8601": "2024-09-24T19:01:53Z", "iso8601_basic": "20240924T150153128535", "iso8601_basic_short": "20240924T150153", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0<<< 46400 1727204513.43377: stdout chunk (state=3): >>>,115200n8"]}, "ansible_loadavg": {"1m": 0.79, "5m": 0.62, "15m": 0.36}, "ansible_lsb": {}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", 
"ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2768, "ansible_swaptotal_mb": 0, "ansible_swapfree_<<< 46400 1727204513.43411: stdout chunk (state=3): >>>mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 764, "free": 2768}, "nocache": {"free": 3245, "used": 287}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 876, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266690560, "block_size": 4096, "block_total": 65519355, "block_available": 64518235, "block_used": 1001120, "inode_total": 131071472, "inode_available": 130998221, "inode_used": 73251, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 46400 1727204513.45971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204513.45975: stdout chunk (state=3): >>><<< 46400 1727204513.45992: stderr chunk (state=3): >>><<< 46400 1727204513.46277: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "53", "epoch": "1727204513", "epoch_int": "1727204513", "date": "2024-09-24", "time": "15:01:53", "iso8601_micro": "2024-09-24T19:01:53.128535Z", "iso8601": "2024-09-24T19:01:53Z", "iso8601_basic": "20240924T150153128535", "iso8601_basic_short": "20240924T150153", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.79, "5m": 0.62, "15m": 0.36}, "ansible_lsb": {}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2768, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 764, "free": 2768}, "nocache": {"free": 3245, "used": 287}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 876, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264266690560, "block_size": 4096, "block_total": 65519355, "block_available": 64518235, "block_used": 1001120, "inode_total": 131071472, "inode_available": 130998221, "inode_used": 73251, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204513.46409: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204513.46432: _low_level_execute_command(): starting 46400 1727204513.46441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204512.5879068-46606-209494570949695/ > /dev/null 2>&1 && sleep 0' 46400 1727204513.47380: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204513.47396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.47413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.47447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.47498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.47512: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204513.47527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.47557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204513.47576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204513.47589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204513.47602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.47616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.47632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.47657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.47676: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204513.47691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.47778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204513.47801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204513.47819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204513.47915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204513.50437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204513.50535: stderr chunk (state=3): >>><<< 46400 1727204513.50552: stdout chunk (state=3): >>><<< 46400 1727204513.50678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204513.50682: handler run complete 46400 1727204513.50798: variable 'ansible_facts' from source: unknown 46400 1727204513.50846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.51194: variable 'ansible_facts' from source: unknown 46400 1727204513.51295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.51448: attempt loop complete, returning result 46400 1727204513.51463: _execute() done 46400 1727204513.51473: dumping result to json 46400 1727204513.51506: done dumping result, returning 46400 1727204513.51517: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-1303-fda8-000000000077] 46400 1727204513.51525: sending task result for task 0affcd87-79f5-1303-fda8-000000000077 ok: [managed-node2] 46400 1727204513.52349: no more pending results, returning what we have 46400 1727204513.52352: results queue empty 46400 1727204513.52353: checking for any_errors_fatal 46400 1727204513.52355: done checking for any_errors_fatal 46400 1727204513.52356: checking for max_fail_percentage 46400 1727204513.52357: done checking for max_fail_percentage 46400 1727204513.52358: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.52359: done checking to see if all hosts have failed 46400 1727204513.52363: getting the remaining hosts for this loop 46400 1727204513.52366: done getting the remaining hosts for this loop 46400 1727204513.52371: getting the next task for host managed-node2 46400 1727204513.52385: done getting next task for host managed-node2 46400 1727204513.52388: ^ task is: TASK: meta (flush_handlers) 46400 1727204513.52389: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204513.52393: getting variables 46400 1727204513.52394: in VariableManager get_vars() 46400 1727204513.52419: Calling all_inventory to load vars for managed-node2 46400 1727204513.52422: Calling groups_inventory to load vars for managed-node2 46400 1727204513.52425: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.52436: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.52440: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.52443: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.52618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.52885: done with get_vars() 46400 1727204513.52895: done getting variables 46400 1727204513.53044: done sending task result for task 0affcd87-79f5-1303-fda8-000000000077 46400 1727204513.53048: WORKER PROCESS EXITING 46400 1727204513.53100: in VariableManager get_vars() 46400 1727204513.53110: Calling all_inventory to load vars for managed-node2 46400 1727204513.53112: Calling groups_inventory to load vars for managed-node2 46400 1727204513.53114: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.53119: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.53121: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.53129: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.53451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.53651: done with get_vars() 46400 1727204513.53669: done queuing things up, now waiting for results queue to drain 46400 1727204513.53671: results queue empty 46400 1727204513.53672: checking for any_errors_fatal 46400 1727204513.53675: done checking for any_errors_fatal 46400 1727204513.53676: checking for max_fail_percentage 46400 1727204513.53677: done checking for max_fail_percentage 46400 1727204513.53678: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.53678: done checking to see if all hosts have failed 46400 1727204513.53679: getting the remaining hosts for this loop 46400 1727204513.53680: done getting the remaining hosts for this loop 46400 1727204513.53687: getting the next task for host managed-node2 46400 1727204513.53691: done getting next task for host managed-node2 46400 1727204513.53693: ^ task is: TASK: Show playbook name 46400 1727204513.53698: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204513.53700: getting variables 46400 1727204513.53701: in VariableManager get_vars() 46400 1727204513.53710: Calling all_inventory to load vars for managed-node2 46400 1727204513.53712: Calling groups_inventory to load vars for managed-node2 46400 1727204513.53714: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.53719: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.53721: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.53724: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.53859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.54062: done with get_vars() 46400 1727204513.54071: done getting variables 46400 1727204513.54153: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Tuesday 24 September 2024 15:01:53 -0400 (0:00:01.009) 0:00:03.826 ***** 46400 1727204513.54184: entering _queue_task() for managed-node2/debug 46400 1727204513.54186: Creating lock for debug 46400 1727204513.54496: worker is 1 (out of 1 available) 46400 1727204513.54508: exiting _queue_task() for managed-node2/debug 46400 1727204513.54520: done queuing things up, now waiting for results queue to drain 46400 1727204513.54522: waiting for pending results... 
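
Each task banner in this verbose output ("TASK [Show playbook name] ***...", followed by the task path and a line such as "Tuesday 24 September 2024 15:01:53 -0400 (0:00:01.009) 0:00:03.826") carries a per-task duration in parentheses and the cumulative playbook time. When reviewing a long run like this one it can help to pull those figures out mechanically; the following is a small, hypothetical helper (not part of Ansible) that ranks tasks by duration from a saved copy of this output, with the log file name assumed.

# Hypothetical helper: rank tasks by duration using the banner/timing format seen in this log.
# The parenthesised value on a timing line measures the task that finished just before that
# banner appears, so banner i is paired with the duration printed under banner i+1.
import re
import sys

TASK_RE = re.compile(r"TASK \[(.+?)\] \*+")
TIME_RE = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+\d+:\d+:\d+\.\d+")

def to_seconds(hours: str, minutes: str, seconds: str) -> float:
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

text = open(sys.argv[1] if len(sys.argv) > 1 else "tests_states.log").read()  # assumed file name
banners = TASK_RE.findall(text)
durations = [to_seconds(*groups) for groups in TIME_RE.findall(text)]

for name, secs in sorted(zip(banners, durations[1:]), key=lambda pair: pair[1], reverse=True):
    print(f"{secs:8.3f}s  {name}")

Run against this output, it would attribute the 0:00:01.009 shown above to the Gathering Facts task that finished just before the Show playbook name banner.
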
46400 1727204513.54783: running TaskExecutor() for managed-node2/TASK: Show playbook name 46400 1727204513.54886: in run() - task 0affcd87-79f5-1303-fda8-00000000000b 46400 1727204513.54910: variable 'ansible_search_path' from source: unknown 46400 1727204513.54951: calling self._execute() 46400 1727204513.55042: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.55054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.55078: variable 'omit' from source: magic vars 46400 1727204513.55568: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.55587: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.55599: variable 'omit' from source: magic vars 46400 1727204513.55636: variable 'omit' from source: magic vars 46400 1727204513.55689: variable 'omit' from source: magic vars 46400 1727204513.55739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204513.55788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.55815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204513.55842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.55858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.55900: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.55908: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.55914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.56013: Set connection var ansible_shell_type to sh 46400 1727204513.56026: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.56034: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.56045: Set connection var ansible_connection to ssh 46400 1727204513.56055: Set connection var ansible_pipelining to False 46400 1727204513.56067: Set connection var ansible_timeout to 10 46400 1727204513.56099: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.56106: variable 'ansible_connection' from source: unknown 46400 1727204513.56111: variable 'ansible_module_compression' from source: unknown 46400 1727204513.56117: variable 'ansible_shell_type' from source: unknown 46400 1727204513.56121: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.56127: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.56132: variable 'ansible_pipelining' from source: unknown 46400 1727204513.56137: variable 'ansible_timeout' from source: unknown 46400 1727204513.56143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.56290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.56305: variable 'omit' from source: magic vars 46400 1727204513.56320: starting attempt loop 46400 1727204513.56327: running the handler 46400 
1727204513.56373: handler run complete 46400 1727204513.56400: attempt loop complete, returning result 46400 1727204513.56407: _execute() done 46400 1727204513.56413: dumping result to json 46400 1727204513.56426: done dumping result, returning 46400 1727204513.56435: done running TaskExecutor() for managed-node2/TASK: Show playbook name [0affcd87-79f5-1303-fda8-00000000000b] 46400 1727204513.56445: sending task result for task 0affcd87-79f5-1303-fda8-00000000000b ok: [managed-node2] => {} MSG: this is: playbooks/tests_states.yml 46400 1727204513.56581: no more pending results, returning what we have 46400 1727204513.56584: results queue empty 46400 1727204513.56585: checking for any_errors_fatal 46400 1727204513.56586: done checking for any_errors_fatal 46400 1727204513.56587: checking for max_fail_percentage 46400 1727204513.56588: done checking for max_fail_percentage 46400 1727204513.56589: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.56590: done checking to see if all hosts have failed 46400 1727204513.56591: getting the remaining hosts for this loop 46400 1727204513.56592: done getting the remaining hosts for this loop 46400 1727204513.56596: getting the next task for host managed-node2 46400 1727204513.56603: done getting next task for host managed-node2 46400 1727204513.56606: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204513.56607: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204513.56610: getting variables 46400 1727204513.56612: in VariableManager get_vars() 46400 1727204513.56637: Calling all_inventory to load vars for managed-node2 46400 1727204513.56640: Calling groups_inventory to load vars for managed-node2 46400 1727204513.56644: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.56653: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.56655: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.56657: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.56893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.57097: done with get_vars() 46400 1727204513.57106: done getting variables 46400 1727204513.57211: done sending task result for task 0affcd87-79f5-1303-fda8-00000000000b 46400 1727204513.57215: WORKER PROCESS EXITING TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.030) 0:00:03.857 ***** 46400 1727204513.57249: entering _queue_task() for managed-node2/include_tasks 46400 1727204513.57769: worker is 1 (out of 1 available) 46400 1727204513.57781: exiting _queue_task() for managed-node2/include_tasks 46400 1727204513.57792: done queuing things up, now waiting for results queue to drain 46400 1727204513.57794: waiting for pending results... 
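
Before the debug handler above produced its result, the log shows the task's condition being evaluated against the facts gathered earlier ("Evaluated conditional (ansible_distribution_major_version != '6'): True"). Ansible performs this through its own templating layer; as a loose approximation only, the same check can be reproduced with plain Jinja2 against a couple of the facts returned by the setup module, as in the sketch below (the trimmed facts dictionary is an assumption).

# Loose approximation, not Ansible's actual Templar: evaluate the conditional from the
# log against a trimmed-down copy of the facts reported for managed-node2 above.
from jinja2 import Environment

facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "9",
}

env = Environment()
rendered = env.from_string("{{ ansible_distribution_major_version != '6' }}").render(**facts)
print(rendered)  # -> "True", matching "Evaluated conditional ...: True" in the log
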
46400 1727204513.58048: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204513.58152: in run() - task 0affcd87-79f5-1303-fda8-00000000000d 46400 1727204513.58176: variable 'ansible_search_path' from source: unknown 46400 1727204513.58221: calling self._execute() 46400 1727204513.58314: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.58327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.58342: variable 'omit' from source: magic vars 46400 1727204513.58750: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.58773: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.58786: _execute() done 46400 1727204513.58797: dumping result to json 46400 1727204513.58804: done dumping result, returning 46400 1727204513.58814: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-00000000000d] 46400 1727204513.58825: sending task result for task 0affcd87-79f5-1303-fda8-00000000000d 46400 1727204513.58986: no more pending results, returning what we have 46400 1727204513.58991: in VariableManager get_vars() 46400 1727204513.59024: Calling all_inventory to load vars for managed-node2 46400 1727204513.59027: Calling groups_inventory to load vars for managed-node2 46400 1727204513.59031: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.59044: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.59047: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.59051: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.59253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.59465: done with get_vars() 46400 1727204513.59473: variable 'ansible_search_path' from source: unknown 46400 1727204513.59488: we have included files to process 46400 1727204513.59489: generating all_blocks data 46400 1727204513.59491: done generating all_blocks data 46400 1727204513.59492: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204513.59493: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204513.59496: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204513.60133: done sending task result for task 0affcd87-79f5-1303-fda8-00000000000d 46400 1727204513.60137: WORKER PROCESS EXITING 46400 1727204513.60416: in VariableManager get_vars() 46400 1727204513.60434: done with get_vars() 46400 1727204513.60481: in VariableManager get_vars() 46400 1727204513.60506: done with get_vars() 46400 1727204513.60548: in VariableManager get_vars() 46400 1727204513.60569: done with get_vars() 46400 1727204513.60633: in VariableManager get_vars() 46400 1727204513.60648: done with get_vars() 46400 1727204513.60694: in VariableManager get_vars() 46400 1727204513.60718: done with get_vars() 46400 1727204513.61119: in VariableManager get_vars() 46400 1727204513.61135: done with get_vars() 46400 1727204513.61156: done processing included file 46400 1727204513.61158: iterating over new_blocks loaded from include file 46400 1727204513.61162: in VariableManager get_vars() 46400 
1727204513.61176: done with get_vars() 46400 1727204513.61177: filtering new block on tags 46400 1727204513.61303: done filtering new block on tags 46400 1727204513.61306: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204513.61311: extending task lists for all hosts with included blocks 46400 1727204513.61347: done extending task lists 46400 1727204513.61349: done processing included files 46400 1727204513.61349: results queue empty 46400 1727204513.61350: checking for any_errors_fatal 46400 1727204513.61358: done checking for any_errors_fatal 46400 1727204513.61362: checking for max_fail_percentage 46400 1727204513.61363: done checking for max_fail_percentage 46400 1727204513.61364: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.61369: done checking to see if all hosts have failed 46400 1727204513.61370: getting the remaining hosts for this loop 46400 1727204513.61372: done getting the remaining hosts for this loop 46400 1727204513.61374: getting the next task for host managed-node2 46400 1727204513.61379: done getting next task for host managed-node2 46400 1727204513.61381: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204513.61384: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204513.61386: getting variables 46400 1727204513.61386: in VariableManager get_vars() 46400 1727204513.61394: Calling all_inventory to load vars for managed-node2 46400 1727204513.61397: Calling groups_inventory to load vars for managed-node2 46400 1727204513.61399: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.61405: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.61407: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.61411: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.61595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.61800: done with get_vars() 46400 1727204513.61812: done getting variables 46400 1727204513.61850: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204513.61981: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.047) 0:00:03.904 ***** 46400 1727204513.62028: entering _queue_task() for managed-node2/debug 46400 1727204513.62339: worker is 1 (out of 1 available) 46400 1727204513.62355: exiting _queue_task() for managed-node2/debug 46400 1727204513.62371: done queuing things up, now waiting for results queue to drain 46400 1727204513.62373: waiting for pending results... 
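The task queued here is the debug action at run_test.yml:5 whose name is templated from lsr_description ("TEST: {{ lsr_description }}" in the state dump above). Its source is not shown in this trace; judging from the banner-style MSG it emits just below, a minimal equivalent would be:

- name: "TEST: {{ lsr_description }}"
  debug:
    # banner format approximated from the MSG printed below in the trace
    msg: |
      ##########
      {{ lsr_description }}
      ##########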
46400 1727204513.62638: running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile 46400 1727204513.62751: in run() - task 0affcd87-79f5-1303-fda8-000000000091 46400 1727204513.62773: variable 'ansible_search_path' from source: unknown 46400 1727204513.62788: variable 'ansible_search_path' from source: unknown 46400 1727204513.62834: calling self._execute() 46400 1727204513.62928: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.62940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.62953: variable 'omit' from source: magic vars 46400 1727204513.63351: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.63375: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.63386: variable 'omit' from source: magic vars 46400 1727204513.63426: variable 'omit' from source: magic vars 46400 1727204513.63552: variable 'lsr_description' from source: include params 46400 1727204513.63585: variable 'omit' from source: magic vars 46400 1727204513.63630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204513.63687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.63716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204513.63739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.63753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.63799: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.63811: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.63819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.63930: Set connection var ansible_shell_type to sh 46400 1727204513.63944: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.63954: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.63968: Set connection var ansible_connection to ssh 46400 1727204513.63984: Set connection var ansible_pipelining to False 46400 1727204513.64000: Set connection var ansible_timeout to 10 46400 1727204513.64034: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.64043: variable 'ansible_connection' from source: unknown 46400 1727204513.64051: variable 'ansible_module_compression' from source: unknown 46400 1727204513.64057: variable 'ansible_shell_type' from source: unknown 46400 1727204513.64070: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.64077: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.64090: variable 'ansible_pipelining' from source: unknown 46400 1727204513.64102: variable 'ansible_timeout' from source: unknown 46400 1727204513.64110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.64273: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204513.64288: variable 'omit' from source: magic vars 46400 1727204513.64296: starting attempt loop 46400 1727204513.64307: running the handler 46400 1727204513.64357: handler run complete 46400 1727204513.64379: attempt loop complete, returning result 46400 1727204513.64385: _execute() done 46400 1727204513.64391: dumping result to json 46400 1727204513.64398: done dumping result, returning 46400 1727204513.64406: done running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile [0affcd87-79f5-1303-fda8-000000000091] 46400 1727204513.64421: sending task result for task 0affcd87-79f5-1303-fda8-000000000091 46400 1727204513.64536: done sending task result for task 0affcd87-79f5-1303-fda8-000000000091 46400 1727204513.64544: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can create a profile ########## 46400 1727204513.64607: no more pending results, returning what we have 46400 1727204513.64612: results queue empty 46400 1727204513.64613: checking for any_errors_fatal 46400 1727204513.64615: done checking for any_errors_fatal 46400 1727204513.64615: checking for max_fail_percentage 46400 1727204513.64617: done checking for max_fail_percentage 46400 1727204513.64618: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.64619: done checking to see if all hosts have failed 46400 1727204513.64620: getting the remaining hosts for this loop 46400 1727204513.64621: done getting the remaining hosts for this loop 46400 1727204513.64625: getting the next task for host managed-node2 46400 1727204513.64633: done getting next task for host managed-node2 46400 1727204513.64636: ^ task is: TASK: Show item 46400 1727204513.64639: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204513.64643: getting variables 46400 1727204513.64645: in VariableManager get_vars() 46400 1727204513.64680: Calling all_inventory to load vars for managed-node2 46400 1727204513.64683: Calling groups_inventory to load vars for managed-node2 46400 1727204513.64687: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.64699: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.64702: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.64704: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.64894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.65089: done with get_vars() 46400 1727204513.65099: done getting variables 46400 1727204513.65150: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.031) 0:00:03.936 ***** 46400 1727204513.65186: entering _queue_task() for managed-node2/debug 46400 1727204513.65686: worker is 1 (out of 1 available) 46400 1727204513.65699: exiting _queue_task() for managed-node2/debug 46400 1727204513.65709: done queuing things up, now waiting for results queue to drain 46400 1727204513.65711: waiting for pending results... 46400 1727204513.65962: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204513.66071: in run() - task 0affcd87-79f5-1303-fda8-000000000092 46400 1727204513.66094: variable 'ansible_search_path' from source: unknown 46400 1727204513.66101: variable 'ansible_search_path' from source: unknown 46400 1727204513.66155: variable 'omit' from source: magic vars 46400 1727204513.66290: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.66311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.66324: variable 'omit' from source: magic vars 46400 1727204513.66748: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.66767: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.66779: variable 'omit' from source: magic vars 46400 1727204513.66824: variable 'omit' from source: magic vars 46400 1727204513.66884: variable 'item' from source: unknown 46400 1727204513.66972: variable 'item' from source: unknown 46400 1727204513.66993: variable 'omit' from source: magic vars 46400 1727204513.67039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204513.67088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.67113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204513.67138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.67152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 46400 1727204513.67194: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.67203: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.67210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.67317: Set connection var ansible_shell_type to sh 46400 1727204513.67331: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.67342: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.67353: Set connection var ansible_connection to ssh 46400 1727204513.67366: Set connection var ansible_pipelining to False 46400 1727204513.67376: Set connection var ansible_timeout to 10 46400 1727204513.67407: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.67414: variable 'ansible_connection' from source: unknown 46400 1727204513.67420: variable 'ansible_module_compression' from source: unknown 46400 1727204513.67426: variable 'ansible_shell_type' from source: unknown 46400 1727204513.67432: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.67437: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.67444: variable 'ansible_pipelining' from source: unknown 46400 1727204513.67449: variable 'ansible_timeout' from source: unknown 46400 1727204513.67463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.67619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.67636: variable 'omit' from source: magic vars 46400 1727204513.67647: starting attempt loop 46400 1727204513.67653: running the handler 46400 1727204513.67707: variable 'lsr_description' from source: include params 46400 1727204513.67791: variable 'lsr_description' from source: include params 46400 1727204513.67805: handler run complete 46400 1727204513.67830: attempt loop complete, returning result 46400 1727204513.67845: variable 'item' from source: unknown 46400 1727204513.67909: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile" } 46400 1727204513.68133: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.68147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.68166: variable 'omit' from source: magic vars 46400 1727204513.68350: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.68366: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.68376: variable 'omit' from source: magic vars 46400 1727204513.68406: variable 'omit' from source: magic vars 46400 1727204513.68450: variable 'item' from source: unknown 46400 1727204513.68528: variable 'item' from source: unknown 46400 1727204513.68548: variable 'omit' from source: magic vars 46400 1727204513.68576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.68590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 46400 1727204513.68603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.68629: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.68637: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.68644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.68726: Set connection var ansible_shell_type to sh 46400 1727204513.68744: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.68755: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.68769: Set connection var ansible_connection to ssh 46400 1727204513.68779: Set connection var ansible_pipelining to False 46400 1727204513.68788: Set connection var ansible_timeout to 10 46400 1727204513.68813: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.68822: variable 'ansible_connection' from source: unknown 46400 1727204513.68832: variable 'ansible_module_compression' from source: unknown 46400 1727204513.68847: variable 'ansible_shell_type' from source: unknown 46400 1727204513.68855: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.68866: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.68875: variable 'ansible_pipelining' from source: unknown 46400 1727204513.68881: variable 'ansible_timeout' from source: unknown 46400 1727204513.68889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.68997: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.69012: variable 'omit' from source: magic vars 46400 1727204513.69021: starting attempt loop 46400 1727204513.69027: running the handler 46400 1727204513.69069: variable 'lsr_setup' from source: include params 46400 1727204513.69135: variable 'lsr_setup' from source: include params 46400 1727204513.69196: handler run complete 46400 1727204513.69217: attempt loop complete, returning result 46400 1727204513.69236: variable 'item' from source: unknown 46400 1727204513.69314: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 46400 1727204513.69492: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.69506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.69519: variable 'omit' from source: magic vars 46400 1727204513.69694: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.69705: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.69714: variable 'omit' from source: magic vars 46400 1727204513.69733: variable 'omit' from source: magic vars 46400 1727204513.69791: variable 'item' from source: unknown 46400 1727204513.69852: variable 'item' from source: unknown 46400 1727204513.69885: variable 'omit' from source: magic vars 46400 1727204513.69907: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.69919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.69930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.69945: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.69952: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.69962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.70044: Set connection var ansible_shell_type to sh 46400 1727204513.70057: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.70072: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.70093: Set connection var ansible_connection to ssh 46400 1727204513.70104: Set connection var ansible_pipelining to False 46400 1727204513.70113: Set connection var ansible_timeout to 10 46400 1727204513.70137: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.70145: variable 'ansible_connection' from source: unknown 46400 1727204513.70152: variable 'ansible_module_compression' from source: unknown 46400 1727204513.70159: variable 'ansible_shell_type' from source: unknown 46400 1727204513.70171: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.70179: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.70194: variable 'ansible_pipelining' from source: unknown 46400 1727204513.70204: variable 'ansible_timeout' from source: unknown 46400 1727204513.70213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.70317: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.70329: variable 'omit' from source: magic vars 46400 1727204513.70337: starting attempt loop 46400 1727204513.70344: running the handler 46400 1727204513.70370: variable 'lsr_test' from source: include params 46400 1727204513.70446: variable 'lsr_test' from source: include params 46400 1727204513.70472: handler run complete 46400 1727204513.70490: attempt loop complete, returning result 46400 1727204513.70511: variable 'item' from source: unknown 46400 1727204513.70588: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile.yml" ] } 46400 1727204513.70743: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.70756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.70773: variable 'omit' from source: magic vars 46400 1727204513.70942: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.70952: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.70959: variable 'omit' from source: magic vars 46400 1727204513.70983: variable 'omit' from source: magic vars 46400 1727204513.71035: variable 'item' from source: unknown 46400 1727204513.71103: 
variable 'item' from source: unknown 46400 1727204513.71130: variable 'omit' from source: magic vars 46400 1727204513.71151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.71166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.71177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.71188: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.71194: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.71199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.71266: Set connection var ansible_shell_type to sh 46400 1727204513.71278: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.71285: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.71291: Set connection var ansible_connection to ssh 46400 1727204513.71298: Set connection var ansible_pipelining to False 46400 1727204513.71305: Set connection var ansible_timeout to 10 46400 1727204513.71327: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.71342: variable 'ansible_connection' from source: unknown 46400 1727204513.71348: variable 'ansible_module_compression' from source: unknown 46400 1727204513.71353: variable 'ansible_shell_type' from source: unknown 46400 1727204513.71358: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.71366: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.71373: variable 'ansible_pipelining' from source: unknown 46400 1727204513.71378: variable 'ansible_timeout' from source: unknown 46400 1727204513.71383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.71473: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.71485: variable 'omit' from source: magic vars 46400 1727204513.71492: starting attempt loop 46400 1727204513.71498: running the handler 46400 1727204513.71518: variable 'lsr_assert' from source: include params 46400 1727204513.71594: variable 'lsr_assert' from source: include params 46400 1727204513.71614: handler run complete 46400 1727204513.71633: attempt loop complete, returning result 46400 1727204513.71655: variable 'item' from source: unknown 46400 1727204513.71732: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_present.yml" ] } 46400 1727204513.71947: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.71959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.71977: variable 'omit' from source: magic vars 46400 1727204513.72148: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.72159: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.72177: variable 'omit' from 
source: magic vars 46400 1727204513.72194: variable 'omit' from source: magic vars 46400 1727204513.72244: variable 'item' from source: unknown 46400 1727204513.72313: variable 'item' from source: unknown 46400 1727204513.72340: variable 'omit' from source: magic vars 46400 1727204513.72363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.72377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.72386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.72399: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.72406: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.72413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.72494: Set connection var ansible_shell_type to sh 46400 1727204513.72507: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.72515: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.72523: Set connection var ansible_connection to ssh 46400 1727204513.72531: Set connection var ansible_pipelining to False 46400 1727204513.72548: Set connection var ansible_timeout to 10 46400 1727204513.72578: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.72585: variable 'ansible_connection' from source: unknown 46400 1727204513.72591: variable 'ansible_module_compression' from source: unknown 46400 1727204513.72597: variable 'ansible_shell_type' from source: unknown 46400 1727204513.72602: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.72608: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.72614: variable 'ansible_pipelining' from source: unknown 46400 1727204513.72620: variable 'ansible_timeout' from source: unknown 46400 1727204513.72626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.72721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.72733: variable 'omit' from source: magic vars 46400 1727204513.72740: starting attempt loop 46400 1727204513.72746: running the handler 46400 1727204513.72780: variable 'lsr_assert_when' from source: include params 46400 1727204513.72845: variable 'lsr_assert_when' from source: include params 46400 1727204513.72947: variable 'network_provider' from source: set_fact 46400 1727204513.72996: handler run complete 46400 1727204513.73015: attempt loop complete, returning result 46400 1727204513.73032: variable 'item' from source: unknown 46400 1727204513.73108: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 46400 1727204513.73263: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.73278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 
46400 1727204513.73290: variable 'omit' from source: magic vars 46400 1727204513.73456: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.73471: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.73479: variable 'omit' from source: magic vars 46400 1727204513.73496: variable 'omit' from source: magic vars 46400 1727204513.73547: variable 'item' from source: unknown 46400 1727204513.73614: variable 'item' from source: unknown 46400 1727204513.73634: variable 'omit' from source: magic vars 46400 1727204513.73667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.73679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.73688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.73702: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.73708: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.73715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.73796: Set connection var ansible_shell_type to sh 46400 1727204513.73809: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.73817: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.73825: Set connection var ansible_connection to ssh 46400 1727204513.73833: Set connection var ansible_pipelining to False 46400 1727204513.73841: Set connection var ansible_timeout to 10 46400 1727204513.73878: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.73885: variable 'ansible_connection' from source: unknown 46400 1727204513.73891: variable 'ansible_module_compression' from source: unknown 46400 1727204513.73897: variable 'ansible_shell_type' from source: unknown 46400 1727204513.73902: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.73908: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.73915: variable 'ansible_pipelining' from source: unknown 46400 1727204513.73921: variable 'ansible_timeout' from source: unknown 46400 1727204513.73928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.74022: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.74034: variable 'omit' from source: magic vars 46400 1727204513.74043: starting attempt loop 46400 1727204513.74051: running the handler 46400 1727204513.74087: variable 'lsr_fail_debug' from source: play vars 46400 1727204513.74151: variable 'lsr_fail_debug' from source: play vars 46400 1727204513.74176: handler run complete 46400 1727204513.74203: attempt loop complete, returning result 46400 1727204513.74222: variable 'item' from source: unknown 46400 1727204513.74286: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 
1727204513.74421: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.74432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.74444: variable 'omit' from source: magic vars 46400 1727204513.74612: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.74633: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.74642: variable 'omit' from source: magic vars 46400 1727204513.74665: variable 'omit' from source: magic vars 46400 1727204513.74712: variable 'item' from source: unknown 46400 1727204513.74791: variable 'item' from source: unknown 46400 1727204513.74811: variable 'omit' from source: magic vars 46400 1727204513.74832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.74862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.74876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.74891: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.74899: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.74905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.74988: Set connection var ansible_shell_type to sh 46400 1727204513.75001: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.75011: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.75021: Set connection var ansible_connection to ssh 46400 1727204513.75029: Set connection var ansible_pipelining to False 46400 1727204513.75038: Set connection var ansible_timeout to 10 46400 1727204513.75075: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.75084: variable 'ansible_connection' from source: unknown 46400 1727204513.75090: variable 'ansible_module_compression' from source: unknown 46400 1727204513.75097: variable 'ansible_shell_type' from source: unknown 46400 1727204513.75104: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.75110: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.75118: variable 'ansible_pipelining' from source: unknown 46400 1727204513.75125: variable 'ansible_timeout' from source: unknown 46400 1727204513.75133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.75234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.75249: variable 'omit' from source: magic vars 46400 1727204513.75258: starting attempt loop 46400 1727204513.75269: running the handler 46400 1727204513.75304: variable 'lsr_cleanup' from source: include params 46400 1727204513.75417: variable 'lsr_cleanup' from source: include params 46400 1727204513.75437: handler run complete 46400 1727204513.75455: attempt loop complete, returning result 46400 1727204513.75480: variable 'item' from source: unknown 46400 1727204513.75554: variable 'item' from source: 
unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 46400 1727204513.75673: dumping result to json 46400 1727204513.75688: done dumping result, returning 46400 1727204513.75699: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-000000000092] 46400 1727204513.75710: sending task result for task 0affcd87-79f5-1303-fda8-000000000092 46400 1727204513.75844: no more pending results, returning what we have 46400 1727204513.75849: results queue empty 46400 1727204513.75850: checking for any_errors_fatal 46400 1727204513.75855: done checking for any_errors_fatal 46400 1727204513.75856: checking for max_fail_percentage 46400 1727204513.75858: done checking for max_fail_percentage 46400 1727204513.75858: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.75859: done checking to see if all hosts have failed 46400 1727204513.75863: getting the remaining hosts for this loop 46400 1727204513.75866: done getting the remaining hosts for this loop 46400 1727204513.75870: getting the next task for host managed-node2 46400 1727204513.75878: done getting next task for host managed-node2 46400 1727204513.75881: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204513.75884: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204513.75887: getting variables 46400 1727204513.75889: in VariableManager get_vars() 46400 1727204513.75914: Calling all_inventory to load vars for managed-node2 46400 1727204513.75917: Calling groups_inventory to load vars for managed-node2 46400 1727204513.75921: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.75932: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.75935: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.75938: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.76108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.76312: done with get_vars() 46400 1727204513.76321: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.112) 0:00:04.048 ***** 46400 1727204513.76430: entering _queue_task() for managed-node2/include_tasks 46400 1727204513.76450: done sending task result for task 0affcd87-79f5-1303-fda8-000000000092 46400 1727204513.76459: WORKER PROCESS EXITING 46400 1727204513.77013: worker is 1 (out of 1 available) 46400 1727204513.77025: exiting _queue_task() for managed-node2/include_tasks 46400 1727204513.77044: done queuing things up, now waiting for results queue to drain 46400 1727204513.77046: waiting for pending results... 
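The per-item results above come from the looping debug task at run_test.yml:9. The output shape (ansible_loop_var, the item name, and the resolved value keyed by that name) is what a templated "debug: var" produces inside a loop, so a sketch consistent with this trace is:

- name: Show item
  debug:
    var: "{{ item }}"        # resolves the variable whose name is held in item
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup

Each loop iteration re-runs the per-task setup, which is why the "Set connection var" and shell/connection loading lines repeat between every pair of item results above.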
46400 1727204513.77303: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204513.77411: in run() - task 0affcd87-79f5-1303-fda8-000000000093 46400 1727204513.77428: variable 'ansible_search_path' from source: unknown 46400 1727204513.77435: variable 'ansible_search_path' from source: unknown 46400 1727204513.77480: calling self._execute() 46400 1727204513.77566: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.77578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.77599: variable 'omit' from source: magic vars 46400 1727204513.77975: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.77993: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.78002: _execute() done 46400 1727204513.78008: dumping result to json 46400 1727204513.78013: done dumping result, returning 46400 1727204513.78020: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-000000000093] 46400 1727204513.78037: sending task result for task 0affcd87-79f5-1303-fda8-000000000093 46400 1727204513.78126: done sending task result for task 0affcd87-79f5-1303-fda8-000000000093 46400 1727204513.78132: WORKER PROCESS EXITING 46400 1727204513.78167: no more pending results, returning what we have 46400 1727204513.78172: in VariableManager get_vars() 46400 1727204513.78205: Calling all_inventory to load vars for managed-node2 46400 1727204513.78209: Calling groups_inventory to load vars for managed-node2 46400 1727204513.78212: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.78226: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.78228: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.78231: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.78455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.78672: done with get_vars() 46400 1727204513.78680: variable 'ansible_search_path' from source: unknown 46400 1727204513.78681: variable 'ansible_search_path' from source: unknown 46400 1727204513.78728: we have included files to process 46400 1727204513.78729: generating all_blocks data 46400 1727204513.78731: done generating all_blocks data 46400 1727204513.78739: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204513.78740: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204513.78742: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204513.79147: in VariableManager get_vars() 46400 1727204513.79169: done with get_vars() 46400 1727204513.79286: done processing included file 46400 1727204513.79288: iterating over new_blocks loaded from include file 46400 1727204513.79289: in VariableManager get_vars() 46400 1727204513.79301: done with get_vars() 46400 1727204513.79303: filtering new block on tags 46400 1727204513.79341: done filtering new block on tags 46400 1727204513.79343: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204513.79349: extending task lists for all hosts with included blocks 46400 1727204513.79838: done extending task lists 46400 1727204513.79839: done processing included files 46400 1727204513.79840: results queue empty 46400 1727204513.79841: checking for any_errors_fatal 46400 1727204513.79845: done checking for any_errors_fatal 46400 1727204513.79846: checking for max_fail_percentage 46400 1727204513.79847: done checking for max_fail_percentage 46400 1727204513.79848: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.79849: done checking to see if all hosts have failed 46400 1727204513.79850: getting the remaining hosts for this loop 46400 1727204513.79851: done getting the remaining hosts for this loop 46400 1727204513.79853: getting the next task for host managed-node2 46400 1727204513.79857: done getting next task for host managed-node2 46400 1727204513.79862: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204513.79866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204513.79869: getting variables 46400 1727204513.79870: in VariableManager get_vars() 46400 1727204513.79878: Calling all_inventory to load vars for managed-node2 46400 1727204513.79880: Calling groups_inventory to load vars for managed-node2 46400 1727204513.79888: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.79893: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.79896: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.79898: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.80048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.80288: done with get_vars() 46400 1727204513.80297: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.039) 0:00:04.088 ***** 46400 1727204513.80379: entering _queue_task() for managed-node2/include_tasks 46400 1727204513.80688: worker is 1 (out of 1 available) 46400 1727204513.80701: exiting _queue_task() for managed-node2/include_tasks 46400 1727204513.80714: done queuing things up, now waiting for results queue to drain 46400 1727204513.80715: waiting for pending results... 
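show_interfaces.yml (loaded above) immediately delegates to get_current_interfaces.yml at its line 3. Only the include itself is visible in this excerpt; a minimal sketch of that step, with the path assumed to be relative to show_interfaces.yml, is:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml    # path relative to the including file (assumed)

Each nested include_tasks adds another level to the "tasks child state?" nesting in the HOST STATE dumps, which is why those dumps grow noticeably deeper from this point in the trace.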
46400 1727204513.80967: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204513.81102: in run() - task 0affcd87-79f5-1303-fda8-0000000000ba 46400 1727204513.81119: variable 'ansible_search_path' from source: unknown 46400 1727204513.81127: variable 'ansible_search_path' from source: unknown 46400 1727204513.81176: calling self._execute() 46400 1727204513.81271: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.81287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.81303: variable 'omit' from source: magic vars 46400 1727204513.81717: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.81734: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.81754: _execute() done 46400 1727204513.81766: dumping result to json 46400 1727204513.81776: done dumping result, returning 46400 1727204513.81786: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-0000000000ba] 46400 1727204513.81797: sending task result for task 0affcd87-79f5-1303-fda8-0000000000ba 46400 1727204513.81912: done sending task result for task 0affcd87-79f5-1303-fda8-0000000000ba 46400 1727204513.81942: no more pending results, returning what we have 46400 1727204513.81947: in VariableManager get_vars() 46400 1727204513.81987: Calling all_inventory to load vars for managed-node2 46400 1727204513.81990: Calling groups_inventory to load vars for managed-node2 46400 1727204513.81995: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.82011: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.82015: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.82018: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.82211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.82402: done with get_vars() 46400 1727204513.82410: variable 'ansible_search_path' from source: unknown 46400 1727204513.82411: variable 'ansible_search_path' from source: unknown 46400 1727204513.82449: we have included files to process 46400 1727204513.82451: generating all_blocks data 46400 1727204513.82453: done generating all_blocks data 46400 1727204513.82455: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204513.82456: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204513.82458: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204513.82886: WORKER PROCESS EXITING 46400 1727204513.83066: done processing included file 46400 1727204513.83068: iterating over new_blocks loaded from include file 46400 1727204513.83070: in VariableManager get_vars() 46400 1727204513.83084: done with get_vars() 46400 1727204513.83086: filtering new block on tags 46400 1727204513.83121: done filtering new block on tags 46400 1727204513.83124: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 46400 1727204513.83129: extending task lists for all hosts with included blocks 46400 1727204513.83302: done extending task lists 46400 1727204513.83303: done processing included files 46400 1727204513.83304: results queue empty 46400 1727204513.83305: checking for any_errors_fatal 46400 1727204513.83307: done checking for any_errors_fatal 46400 1727204513.83308: checking for max_fail_percentage 46400 1727204513.83309: done checking for max_fail_percentage 46400 1727204513.83310: checking to see if all hosts have failed and the running result is not ok 46400 1727204513.83311: done checking to see if all hosts have failed 46400 1727204513.83312: getting the remaining hosts for this loop 46400 1727204513.83313: done getting the remaining hosts for this loop 46400 1727204513.83315: getting the next task for host managed-node2 46400 1727204513.83320: done getting next task for host managed-node2 46400 1727204513.83322: ^ task is: TASK: Gather current interface info 46400 1727204513.83325: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204513.83327: getting variables 46400 1727204513.83328: in VariableManager get_vars() 46400 1727204513.83336: Calling all_inventory to load vars for managed-node2 46400 1727204513.83338: Calling groups_inventory to load vars for managed-node2 46400 1727204513.83341: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204513.83345: Calling all_plugins_play to load vars for managed-node2 46400 1727204513.83348: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204513.83351: Calling groups_plugins_play to load vars for managed-node2 46400 1727204513.83518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204513.83701: done with get_vars() 46400 1727204513.83710: done getting variables 46400 1727204513.83746: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.033) 0:00:04.122 ***** 46400 1727204513.83780: entering _queue_task() for managed-node2/command 46400 1727204513.84024: worker is 1 (out of 1 available) 46400 1727204513.84037: exiting _queue_task() for managed-node2/command 46400 1727204513.84050: done queuing things up, now waiting for results queue to drain 46400 1727204513.84052: waiting for pending results... 
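[A hedged sketch of the task being queued here (get_current_interfaces.yml:3), reconstructed from the command-module arguments and result that appear later in this run; the register name _current_interfaces is inferred from a later variable lookup and may not match the actual task file.]

# Reconstruction for readability only -- not the verbatim task file
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces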
46400 1727204513.84303: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204513.84415: in run() - task 0affcd87-79f5-1303-fda8-0000000000f5 46400 1727204513.84433: variable 'ansible_search_path' from source: unknown 46400 1727204513.84442: variable 'ansible_search_path' from source: unknown 46400 1727204513.84485: calling self._execute() 46400 1727204513.84575: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.84588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.84607: variable 'omit' from source: magic vars 46400 1727204513.84943: variable 'ansible_distribution_major_version' from source: facts 46400 1727204513.84958: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204513.84973: variable 'omit' from source: magic vars 46400 1727204513.85021: variable 'omit' from source: magic vars 46400 1727204513.85067: variable 'omit' from source: magic vars 46400 1727204513.85110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204513.85152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204513.85182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204513.85204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.85217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204513.85249: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204513.85259: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.85272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.85370: Set connection var ansible_shell_type to sh 46400 1727204513.85386: Set connection var ansible_shell_executable to /bin/sh 46400 1727204513.85395: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204513.85402: Set connection var ansible_connection to ssh 46400 1727204513.85410: Set connection var ansible_pipelining to False 46400 1727204513.85417: Set connection var ansible_timeout to 10 46400 1727204513.85439: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.85444: variable 'ansible_connection' from source: unknown 46400 1727204513.85451: variable 'ansible_module_compression' from source: unknown 46400 1727204513.85456: variable 'ansible_shell_type' from source: unknown 46400 1727204513.85466: variable 'ansible_shell_executable' from source: unknown 46400 1727204513.85474: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204513.85480: variable 'ansible_pipelining' from source: unknown 46400 1727204513.85485: variable 'ansible_timeout' from source: unknown 46400 1727204513.85491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204513.85633: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204513.85650: variable 'omit' from source: magic vars 46400 
1727204513.85663: starting attempt loop 46400 1727204513.85673: running the handler 46400 1727204513.85696: _low_level_execute_command(): starting 46400 1727204513.85709: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204513.86488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204513.86504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.86520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.86540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.86591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.86604: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204513.86619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.86639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204513.86651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204513.86668: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204513.86686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.86701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.86717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.86730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.86742: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204513.86757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.86840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204513.86870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204513.86889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204513.86980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204513.89280: stdout chunk (state=3): >>>/root <<< 46400 1727204513.89441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204513.89526: stderr chunk (state=3): >>><<< 46400 1727204513.89531: stdout chunk (state=3): >>><<< 46400 1727204513.89654: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204513.89657: _low_level_execute_command(): starting 46400 1727204513.89662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681 `" && echo ansible-tmp-1727204513.8955426-46650-88378212664681="` echo /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681 `" ) && sleep 0' 46400 1727204513.90258: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204513.90276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.90290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.90309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.90351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.90371: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204513.90387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.90406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204513.90423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204513.90435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204513.90449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204513.90468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204513.90485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204513.90497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204513.90508: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204513.90521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204513.90605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204513.90628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204513.90650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204513.90756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204513.93386: stdout chunk (state=3): >>>ansible-tmp-1727204513.8955426-46650-88378212664681=/root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681 <<< 46400 1727204513.93557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204513.93651: stderr chunk (state=3): >>><<< 46400 1727204513.93662: stdout chunk (state=3): >>><<< 46400 1727204513.93770: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204513.8955426-46650-88378212664681=/root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204513.93773: variable 'ansible_module_compression' from source: unknown 46400 1727204513.93901: ANSIBALLZ: Using generic lock for ansible.legacy.command 46400 1727204513.93905: ANSIBALLZ: Acquiring lock 46400 1727204513.93907: ANSIBALLZ: Lock acquired: 140519374124768 46400 1727204513.93909: ANSIBALLZ: Creating module 46400 1727204514.12014: ANSIBALLZ: Writing module into payload 46400 1727204514.12108: ANSIBALLZ: Writing module 46400 1727204514.12126: ANSIBALLZ: Renaming module 46400 1727204514.12129: ANSIBALLZ: Done creating module 46400 1727204514.12160: variable 'ansible_facts' from source: unknown 46400 1727204514.12211: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/AnsiballZ_command.py 46400 1727204514.12321: Sending initial data 46400 1727204514.12324: Sent initial data (155 bytes) 46400 1727204514.13225: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204514.13235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.13251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.13271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.13309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.13341: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204514.13344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.13347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204514.13349: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204514.13352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204514.13361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.13380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.13396: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.13409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.13422: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204514.13436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.13522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.13539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.13554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.13630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 46400 1727204514.15610: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204514.15641: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204514.15699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpatb7ny08 /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/AnsiballZ_command.py <<< 46400 1727204514.15721: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204514.16869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.16873: stdout chunk (state=3): >>><<< 46400 1727204514.16876: stderr chunk (state=3): >>><<< 46400 1727204514.16878: done transferring module to remote 46400 1727204514.16880: _low_level_execute_command(): starting 46400 1727204514.16882: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/ /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/AnsiballZ_command.py && sleep 0' 46400 1727204514.17514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204514.17536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.17552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.17575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.17848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.17860: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204514.17879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.17899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204514.17925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204514.17939: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204514.17952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.17970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.17989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.18001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.18012: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204514.18036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.18113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.18136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.18162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.18232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.20600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.20642: stderr chunk (state=3): >>><<< 46400 1727204514.20645: stdout chunk (state=3): >>><<< 46400 1727204514.20660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204514.20667: _low_level_execute_command(): starting 46400 1727204514.20673: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/AnsiballZ_command.py && sleep 0' 46400 1727204514.21158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.21169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.21211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.21215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.21218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.21272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.21283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.21338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.42452: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:54.420564", "end": "2024-09-24 15:01:54.423721", "delta": "0:00:00.003157", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204514.43759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204514.43818: stderr chunk (state=3): >>><<< 46400 1727204514.43822: stdout chunk (state=3): >>><<< 46400 1727204514.43840: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:54.420564", "end": "2024-09-24 15:01:54.423721", "delta": "0:00:00.003157", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
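[The JSON above is the raw module result returned over the connection. When a command result is registered, Ansible also derives stdout_lines from stdout; assuming the result is registered as _current_interfaces (the name seen in a later variable lookup), the registered value for this run would look roughly like the abridged sketch below. Note that the raw module result reports "changed": true while the final task result further below shows "changed": false, which suggests the task sets changed_when: false; that is an inference, not something stated in the log.]

# Abridged illustration of the registered result, assuming the name _current_interfaces
_current_interfaces:
  rc: 0
  stdout: "bonding_masters\neth0\nlo"
  stdout_lines:
    - bonding_masters
    - eth0
    - lo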
46400 1727204514.43873: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204514.43880: _low_level_execute_command(): starting 46400 1727204514.43885: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204513.8955426-46650-88378212664681/ > /dev/null 2>&1 && sleep 0' 46400 1727204514.44345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.44349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.44391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.44394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.44404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.44414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.44419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.44479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.44483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.44496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.44553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.46729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.46778: stderr chunk (state=3): >>><<< 46400 1727204514.46782: stdout chunk (state=3): >>><<< 46400 1727204514.46795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204514.46801: handler run complete 46400 1727204514.46824: Evaluated conditional (False): False 46400 1727204514.46834: attempt loop complete, returning result 46400 1727204514.46837: _execute() done 46400 1727204514.46840: dumping result to json 46400 1727204514.46845: done dumping result, returning 46400 1727204514.46862: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-0000000000f5] 46400 1727204514.46872: sending task result for task 0affcd87-79f5-1303-fda8-0000000000f5 46400 1727204514.46965: done sending task result for task 0affcd87-79f5-1303-fda8-0000000000f5 46400 1727204514.46968: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003157", "end": "2024-09-24 15:01:54.423721", "rc": 0, "start": "2024-09-24 15:01:54.420564" } STDOUT: bonding_masters eth0 lo 46400 1727204514.47033: no more pending results, returning what we have 46400 1727204514.47037: results queue empty 46400 1727204514.47038: checking for any_errors_fatal 46400 1727204514.47039: done checking for any_errors_fatal 46400 1727204514.47040: checking for max_fail_percentage 46400 1727204514.47041: done checking for max_fail_percentage 46400 1727204514.47042: checking to see if all hosts have failed and the running result is not ok 46400 1727204514.47043: done checking to see if all hosts have failed 46400 1727204514.47044: getting the remaining hosts for this loop 46400 1727204514.47045: done getting the remaining hosts for this loop 46400 1727204514.47049: getting the next task for host managed-node2 46400 1727204514.47056: done getting next task for host managed-node2 46400 1727204514.47058: ^ task is: TASK: Set current_interfaces 46400 1727204514.47066: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204514.47069: getting variables 46400 1727204514.47071: in VariableManager get_vars() 46400 1727204514.47149: Calling all_inventory to load vars for managed-node2 46400 1727204514.47152: Calling groups_inventory to load vars for managed-node2 46400 1727204514.47157: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204514.47174: Calling all_plugins_play to load vars for managed-node2 46400 1727204514.47177: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204514.47180: Calling groups_plugins_play to load vars for managed-node2 46400 1727204514.47327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204514.47466: done with get_vars() 46400 1727204514.47474: done getting variables 46400 1727204514.47519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.637) 0:00:04.760 ***** 46400 1727204514.47546: entering _queue_task() for managed-node2/set_fact 46400 1727204514.47743: worker is 1 (out of 1 available) 46400 1727204514.47755: exiting _queue_task() for managed-node2/set_fact 46400 1727204514.47772: done queuing things up, now waiting for results queue to drain 46400 1727204514.47773: waiting for pending results... 
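[A hedged sketch of the set_fact task at get_current_interfaces.yml:9, reconstructed from the fact shown in the result below; the exact expression, assumed here to read stdout_lines from the registered _current_interfaces, may differ from the actual file.]

# Reconstruction for readability only -- expression is an assumption
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"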
46400 1727204514.47938: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204514.48054: in run() - task 0affcd87-79f5-1303-fda8-0000000000f6 46400 1727204514.48076: variable 'ansible_search_path' from source: unknown 46400 1727204514.48089: variable 'ansible_search_path' from source: unknown 46400 1727204514.48128: calling self._execute() 46400 1727204514.48215: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.48227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.48241: variable 'omit' from source: magic vars 46400 1727204514.50419: variable 'ansible_distribution_major_version' from source: facts 46400 1727204514.50444: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204514.50585: variable 'omit' from source: magic vars 46400 1727204514.50642: variable 'omit' from source: magic vars 46400 1727204514.50983: variable '_current_interfaces' from source: set_fact 46400 1727204514.51083: variable 'omit' from source: magic vars 46400 1727204514.51146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204514.51192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204514.51231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204514.51258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204514.51277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204514.51313: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204514.51325: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.51337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.51437: Set connection var ansible_shell_type to sh 46400 1727204514.51459: Set connection var ansible_shell_executable to /bin/sh 46400 1727204514.51472: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204514.51482: Set connection var ansible_connection to ssh 46400 1727204514.51490: Set connection var ansible_pipelining to False 46400 1727204514.51499: Set connection var ansible_timeout to 10 46400 1727204514.51527: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.51540: variable 'ansible_connection' from source: unknown 46400 1727204514.51548: variable 'ansible_module_compression' from source: unknown 46400 1727204514.51559: variable 'ansible_shell_type' from source: unknown 46400 1727204514.51571: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.51578: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.51584: variable 'ansible_pipelining' from source: unknown 46400 1727204514.51590: variable 'ansible_timeout' from source: unknown 46400 1727204514.51596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.51744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204514.51766: variable 'omit' from source: magic vars 46400 1727204514.51784: starting attempt loop 46400 1727204514.51790: running the handler 46400 1727204514.51804: handler run complete 46400 1727204514.51817: attempt loop complete, returning result 46400 1727204514.51823: _execute() done 46400 1727204514.51829: dumping result to json 46400 1727204514.51835: done dumping result, returning 46400 1727204514.51844: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-0000000000f6] 46400 1727204514.51853: sending task result for task 0affcd87-79f5-1303-fda8-0000000000f6 46400 1727204514.51968: done sending task result for task 0affcd87-79f5-1303-fda8-0000000000f6 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204514.52037: no more pending results, returning what we have 46400 1727204514.52041: results queue empty 46400 1727204514.52043: checking for any_errors_fatal 46400 1727204514.52050: done checking for any_errors_fatal 46400 1727204514.52051: checking for max_fail_percentage 46400 1727204514.52054: done checking for max_fail_percentage 46400 1727204514.52055: checking to see if all hosts have failed and the running result is not ok 46400 1727204514.52056: done checking to see if all hosts have failed 46400 1727204514.52056: getting the remaining hosts for this loop 46400 1727204514.52059: done getting the remaining hosts for this loop 46400 1727204514.52064: getting the next task for host managed-node2 46400 1727204514.52078: done getting next task for host managed-node2 46400 1727204514.52081: ^ task is: TASK: Show current_interfaces 46400 1727204514.52085: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204514.52090: getting variables 46400 1727204514.52091: in VariableManager get_vars() 46400 1727204514.52128: Calling all_inventory to load vars for managed-node2 46400 1727204514.52131: Calling groups_inventory to load vars for managed-node2 46400 1727204514.52136: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204514.52150: Calling all_plugins_play to load vars for managed-node2 46400 1727204514.52153: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204514.52157: Calling groups_plugins_play to load vars for managed-node2 46400 1727204514.52356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204514.52580: done with get_vars() 46400 1727204514.52591: done getting variables 46400 1727204514.52728: WORKER PROCESS EXITING 46400 1727204514.52773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.068) 0:00:04.828 ***** 46400 1727204514.54381: entering _queue_task() for managed-node2/debug 46400 1727204514.55187: worker is 1 (out of 1 available) 46400 1727204514.55205: exiting _queue_task() for managed-node2/debug 46400 1727204514.55217: done queuing things up, now waiting for results queue to drain 46400 1727204514.55220: waiting for pending results... 
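[A hedged sketch of the debug task at show_interfaces.yml:5, inferred from the MSG printed in the result that follows; the exact msg template is an assumption.]

# Reconstruction for readability only -- msg wording is an assumption
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"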
46400 1727204514.56060: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204514.56286: in run() - task 0affcd87-79f5-1303-fda8-0000000000bb 46400 1727204514.56308: variable 'ansible_search_path' from source: unknown 46400 1727204514.56355: variable 'ansible_search_path' from source: unknown 46400 1727204514.56401: calling self._execute() 46400 1727204514.56598: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.56634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.56683: variable 'omit' from source: magic vars 46400 1727204514.57427: variable 'ansible_distribution_major_version' from source: facts 46400 1727204514.57486: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204514.57544: variable 'omit' from source: magic vars 46400 1727204514.57593: variable 'omit' from source: magic vars 46400 1727204514.57803: variable 'current_interfaces' from source: set_fact 46400 1727204514.57898: variable 'omit' from source: magic vars 46400 1727204514.57981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204514.58082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204514.58180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204514.58205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204514.58219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204514.58293: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204514.58303: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.58312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.58462: Set connection var ansible_shell_type to sh 46400 1727204514.58539: Set connection var ansible_shell_executable to /bin/sh 46400 1727204514.58598: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204514.58609: Set connection var ansible_connection to ssh 46400 1727204514.58618: Set connection var ansible_pipelining to False 46400 1727204514.58627: Set connection var ansible_timeout to 10 46400 1727204514.58660: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.58704: variable 'ansible_connection' from source: unknown 46400 1727204514.58712: variable 'ansible_module_compression' from source: unknown 46400 1727204514.58719: variable 'ansible_shell_type' from source: unknown 46400 1727204514.58726: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.58750: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.58758: variable 'ansible_pipelining' from source: unknown 46400 1727204514.58812: variable 'ansible_timeout' from source: unknown 46400 1727204514.58822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.59091: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
46400 1727204514.59145: variable 'omit' from source: magic vars 46400 1727204514.59155: starting attempt loop 46400 1727204514.59185: running the handler 46400 1727204514.59232: handler run complete 46400 1727204514.59306: attempt loop complete, returning result 46400 1727204514.59356: _execute() done 46400 1727204514.59365: dumping result to json 46400 1727204514.59373: done dumping result, returning 46400 1727204514.59398: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-0000000000bb] 46400 1727204514.59412: sending task result for task 0affcd87-79f5-1303-fda8-0000000000bb ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204514.59620: no more pending results, returning what we have 46400 1727204514.59624: results queue empty 46400 1727204514.59625: checking for any_errors_fatal 46400 1727204514.59630: done checking for any_errors_fatal 46400 1727204514.59630: checking for max_fail_percentage 46400 1727204514.59632: done checking for max_fail_percentage 46400 1727204514.59633: checking to see if all hosts have failed and the running result is not ok 46400 1727204514.59634: done checking to see if all hosts have failed 46400 1727204514.59635: getting the remaining hosts for this loop 46400 1727204514.59637: done getting the remaining hosts for this loop 46400 1727204514.59642: getting the next task for host managed-node2 46400 1727204514.59651: done getting next task for host managed-node2 46400 1727204514.59654: ^ task is: TASK: Setup 46400 1727204514.59657: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204514.59661: getting variables 46400 1727204514.59665: in VariableManager get_vars() 46400 1727204514.59697: Calling all_inventory to load vars for managed-node2 46400 1727204514.59702: Calling groups_inventory to load vars for managed-node2 46400 1727204514.59706: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204514.59718: Calling all_plugins_play to load vars for managed-node2 46400 1727204514.59721: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204514.59724: Calling groups_plugins_play to load vars for managed-node2 46400 1727204514.59898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204514.60140: done with get_vars() 46400 1727204514.60151: done getting variables 46400 1727204514.60298: done sending task result for task 0affcd87-79f5-1303-fda8-0000000000bb 46400 1727204514.60302: WORKER PROCESS EXITING TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.059) 0:00:04.889 ***** 46400 1727204514.60521: entering _queue_task() for managed-node2/include_tasks 46400 1727204514.61177: worker is 1 (out of 1 available) 46400 1727204514.61189: exiting _queue_task() for managed-node2/include_tasks 46400 1727204514.61201: done queuing things up, now waiting for results queue to drain 46400 1727204514.61202: waiting for pending results... 46400 1727204514.62018: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204514.62237: in run() - task 0affcd87-79f5-1303-fda8-000000000094 46400 1727204514.62257: variable 'ansible_search_path' from source: unknown 46400 1727204514.62273: variable 'ansible_search_path' from source: unknown 46400 1727204514.62418: variable 'lsr_setup' from source: include params 46400 1727204514.62740: variable 'lsr_setup' from source: include params 46400 1727204514.62820: variable 'omit' from source: magic vars 46400 1727204514.62950: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.62966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.62982: variable 'omit' from source: magic vars 46400 1727204514.63264: variable 'ansible_distribution_major_version' from source: facts 46400 1727204514.63288: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204514.63300: variable 'item' from source: unknown 46400 1727204514.63401: variable 'item' from source: unknown 46400 1727204514.63451: variable 'item' from source: unknown 46400 1727204514.63544: variable 'item' from source: unknown 46400 1727204514.63745: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.63758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.63774: variable 'omit' from source: magic vars 46400 1727204514.63940: variable 'ansible_distribution_major_version' from source: facts 46400 1727204514.63951: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204514.63960: variable 'item' from source: unknown 46400 1727204514.64030: variable 'item' from source: unknown 46400 1727204514.64063: variable 'item' from source: unknown 46400 1727204514.64131: variable 'item' from source: unknown 46400 1727204514.64210: dumping result to json 46400 1727204514.64219: 
done dumping result, returning 46400 1727204514.64238: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-000000000094] 46400 1727204514.64247: sending task result for task 0affcd87-79f5-1303-fda8-000000000094 46400 1727204514.64311: done sending task result for task 0affcd87-79f5-1303-fda8-000000000094 46400 1727204514.64318: WORKER PROCESS EXITING 46400 1727204514.64359: no more pending results, returning what we have 46400 1727204514.64366: in VariableManager get_vars() 46400 1727204514.64405: Calling all_inventory to load vars for managed-node2 46400 1727204514.64409: Calling groups_inventory to load vars for managed-node2 46400 1727204514.64413: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204514.64429: Calling all_plugins_play to load vars for managed-node2 46400 1727204514.64432: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204514.64436: Calling groups_plugins_play to load vars for managed-node2 46400 1727204514.64626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204514.64854: done with get_vars() 46400 1727204514.64865: variable 'ansible_search_path' from source: unknown 46400 1727204514.64866: variable 'ansible_search_path' from source: unknown 46400 1727204514.64916: variable 'ansible_search_path' from source: unknown 46400 1727204514.64917: variable 'ansible_search_path' from source: unknown 46400 1727204514.64949: we have included files to process 46400 1727204514.64951: generating all_blocks data 46400 1727204514.64953: done generating all_blocks data 46400 1727204514.64960: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204514.64961: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204514.64963: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204514.65461: done processing included file 46400 1727204514.65463: iterating over new_blocks loaded from include file 46400 1727204514.65466: in VariableManager get_vars() 46400 1727204514.65482: done with get_vars() 46400 1727204514.65483: filtering new block on tags 46400 1727204514.65508: done filtering new block on tags 46400 1727204514.65511: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 => (item=tasks/delete_interface.yml) 46400 1727204514.65515: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204514.65516: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204514.65519: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204514.65671: in VariableManager get_vars() 46400 1727204514.65689: done with get_vars() 46400 1727204514.65801: done processing included file 46400 1727204514.65803: iterating over new_blocks loaded from include file 46400 1727204514.65804: in VariableManager get_vars() 46400 1727204514.65824: 
done with get_vars() 46400 1727204514.65826: filtering new block on tags 46400 1727204514.66110: done filtering new block on tags 46400 1727204514.66112: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 46400 1727204514.66116: extending task lists for all hosts with included blocks 46400 1727204514.66752: done extending task lists 46400 1727204514.66753: done processing included files 46400 1727204514.66754: results queue empty 46400 1727204514.66755: checking for any_errors_fatal 46400 1727204514.66758: done checking for any_errors_fatal 46400 1727204514.66759: checking for max_fail_percentage 46400 1727204514.66760: done checking for max_fail_percentage 46400 1727204514.66761: checking to see if all hosts have failed and the running result is not ok 46400 1727204514.66761: done checking to see if all hosts have failed 46400 1727204514.66762: getting the remaining hosts for this loop 46400 1727204514.66765: done getting the remaining hosts for this loop 46400 1727204514.66767: getting the next task for host managed-node2 46400 1727204514.66772: done getting next task for host managed-node2 46400 1727204514.66774: ^ task is: TASK: Remove test interface if necessary 46400 1727204514.66777: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204514.66779: getting variables 46400 1727204514.66780: in VariableManager get_vars() 46400 1727204514.66793: Calling all_inventory to load vars for managed-node2 46400 1727204514.66795: Calling groups_inventory to load vars for managed-node2 46400 1727204514.66798: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204514.66804: Calling all_plugins_play to load vars for managed-node2 46400 1727204514.66806: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204514.66809: Calling groups_plugins_play to load vars for managed-node2 46400 1727204514.67035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204514.67322: done with get_vars() 46400 1727204514.67330: done getting variables 46400 1727204514.67373: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.068) 0:00:04.958 ***** 46400 1727204514.67406: entering _queue_task() for managed-node2/command 46400 1727204514.67770: worker is 1 (out of 1 available) 46400 1727204514.67782: exiting _queue_task() for managed-node2/command 46400 1727204514.67795: done queuing things up, now waiting for results queue to drain 46400 1727204514.67797: waiting for pending results... 46400 1727204514.68250: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 46400 1727204514.68352: in run() - task 0affcd87-79f5-1303-fda8-00000000011b 46400 1727204514.68379: variable 'ansible_search_path' from source: unknown 46400 1727204514.68447: variable 'ansible_search_path' from source: unknown 46400 1727204514.68492: calling self._execute() 46400 1727204514.68574: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.68593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.68607: variable 'omit' from source: magic vars 46400 1727204514.69011: variable 'ansible_distribution_major_version' from source: facts 46400 1727204514.69036: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204514.69052: variable 'omit' from source: magic vars 46400 1727204514.69106: variable 'omit' from source: magic vars 46400 1727204514.69218: variable 'interface' from source: play vars 46400 1727204514.69248: variable 'omit' from source: magic vars 46400 1727204514.69299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204514.69340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204514.69385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204514.69408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204514.69423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 
1727204514.69463: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204514.69480: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.69488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.69610: Set connection var ansible_shell_type to sh 46400 1727204514.69632: Set connection var ansible_shell_executable to /bin/sh 46400 1727204514.69650: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204514.69660: Set connection var ansible_connection to ssh 46400 1727204514.69675: Set connection var ansible_pipelining to False 46400 1727204514.69692: Set connection var ansible_timeout to 10 46400 1727204514.69729: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.69737: variable 'ansible_connection' from source: unknown 46400 1727204514.69753: variable 'ansible_module_compression' from source: unknown 46400 1727204514.69760: variable 'ansible_shell_type' from source: unknown 46400 1727204514.69770: variable 'ansible_shell_executable' from source: unknown 46400 1727204514.69791: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204514.69799: variable 'ansible_pipelining' from source: unknown 46400 1727204514.69811: variable 'ansible_timeout' from source: unknown 46400 1727204514.69826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204514.70034: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204514.70059: variable 'omit' from source: magic vars 46400 1727204514.70072: starting attempt loop 46400 1727204514.70084: running the handler 46400 1727204514.70104: _low_level_execute_command(): starting 46400 1727204514.70217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204514.71352: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.71357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.71392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.71395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204514.71397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.71400: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.71469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.71473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.71475: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.71541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.73149: stdout chunk (state=3): >>>/root <<< 46400 1727204514.73325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.73329: stdout chunk (state=3): >>><<< 46400 1727204514.73339: stderr chunk (state=3): >>><<< 46400 1727204514.73362: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204514.73388: _low_level_execute_command(): starting 46400 1727204514.73392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753 `" && echo ansible-tmp-1727204514.733715-46700-176295042942753="` echo /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753 `" ) && sleep 0' 46400 1727204514.74968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204514.75035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.75053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.75076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.75133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.75257: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204514.75276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.75296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204514.75309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204514.75321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204514.75334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.75352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.75376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
46400 1727204514.75390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.75402: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204514.75418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.75606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.75623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.75639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.75804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.77597: stdout chunk (state=3): >>>ansible-tmp-1727204514.733715-46700-176295042942753=/root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753 <<< 46400 1727204514.77798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.77801: stdout chunk (state=3): >>><<< 46400 1727204514.77804: stderr chunk (state=3): >>><<< 46400 1727204514.78071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204514.733715-46700-176295042942753=/root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204514.78075: variable 'ansible_module_compression' from source: unknown 46400 1727204514.78078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204514.78080: variable 'ansible_facts' from source: unknown 46400 1727204514.78082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/AnsiballZ_command.py 46400 1727204514.78237: Sending initial data 46400 1727204514.78247: Sent initial data (155 bytes) 46400 1727204514.79303: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.79306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.79335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.79339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.79349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.79425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.79428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.79431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.79479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.81192: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204514.81219: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204514.81258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp8dpl0pin /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/AnsiballZ_command.py <<< 46400 1727204514.81294: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204514.82687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.82853: stderr chunk (state=3): >>><<< 46400 1727204514.82856: stdout chunk (state=3): >>><<< 46400 1727204514.82858: done transferring module to remote 46400 1727204514.82860: _low_level_execute_command(): starting 46400 1727204514.82862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/ /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/AnsiballZ_command.py && sleep 0' 46400 1727204514.83735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.83739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.83788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204514.83791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204514.83793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.83799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.83801: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.83866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.83870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.83921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204514.85623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204514.85710: stderr chunk (state=3): >>><<< 46400 1727204514.85714: stdout chunk (state=3): >>><<< 46400 1727204514.85806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204514.85809: _low_level_execute_command(): starting 46400 1727204514.85812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/AnsiballZ_command.py && sleep 0' 46400 1727204514.86376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204514.86390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.86405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.86424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.86469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.86483: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204514.86498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.86515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204514.86526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.13.78 is address <<< 46400 1727204514.86536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204514.86548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204514.86567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204514.86584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204514.86596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204514.86608: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204514.86622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204514.86702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204514.86719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204514.86734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204514.87055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.00692: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:01:54.998599", "end": "2024-09-24 15:01:55.005996", "delta": "0:00:00.007397", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204515.01715: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204515.01770: stderr chunk (state=3): >>><<< 46400 1727204515.01774: stdout chunk (state=3): >>><<< 46400 1727204515.01793: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:01:54.998599", "end": "2024-09-24 15:01:55.005996", "delta": "0:00:00.007397", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
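
The rc=1 result above belongs to the "Remove test interface if necessary" task at tasks/delete_interface.yml:3: the log shows ansible.legacy.command running "ip link del statebr" on managed-node2, the device not existing, and the failure subsequently being ignored. A minimal sketch of an equivalent task follows; it is a reconstruction for illustration only, and the use of the interface variable and the error-tolerance setting are assumptions inferred from the log, not the actual file contents.

    - name: Remove test interface if necessary
      # the log reports the module as ansible.legacy.command and the rendered command "ip link del statebr"
      ansible.builtin.command: ip link del {{ interface }}
      ignore_errors: true  # assumption: the "...ignoring" in the log implies the failure is tolerated (could equally be failed_when)
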
46400 1727204515.01821: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204515.01831: _low_level_execute_command(): starting 46400 1727204515.01839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204514.733715-46700-176295042942753/ > /dev/null 2>&1 && sleep 0' 46400 1727204515.02299: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.02303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.02337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.02341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.02343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.02401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.02407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.02448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.04323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.04328: stdout chunk (state=3): >>><<< 46400 1727204515.04330: stderr chunk (state=3): >>><<< 46400 1727204515.04360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204515.04366: handler run complete 46400 1727204515.04394: Evaluated conditional (False): False 46400 1727204515.04401: attempt loop complete, returning result 46400 1727204515.04404: _execute() done 46400 1727204515.04406: dumping result to json 46400 1727204515.04412: done dumping result, returning 46400 1727204515.04419: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [0affcd87-79f5-1303-fda8-00000000011b] 46400 1727204515.04423: sending task result for task 0affcd87-79f5-1303-fda8-00000000011b 46400 1727204515.04540: done sending task result for task 0affcd87-79f5-1303-fda8-00000000011b 46400 1727204515.04542: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007397", "end": "2024-09-24 15:01:55.005996", "rc": 1, "start": "2024-09-24 15:01:54.998599" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204515.04621: no more pending results, returning what we have 46400 1727204515.04625: results queue empty 46400 1727204515.04626: checking for any_errors_fatal 46400 1727204515.04627: done checking for any_errors_fatal 46400 1727204515.04628: checking for max_fail_percentage 46400 1727204515.04629: done checking for max_fail_percentage 46400 1727204515.04630: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.04631: done checking to see if all hosts have failed 46400 1727204515.04632: getting the remaining hosts for this loop 46400 1727204515.04633: done getting the remaining hosts for this loop 46400 1727204515.04637: getting the next task for host managed-node2 46400 1727204515.04645: done getting next task for host managed-node2 46400 1727204515.04648: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204515.04652: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204515.04655: getting variables 46400 1727204515.04656: in VariableManager get_vars() 46400 1727204515.04687: Calling all_inventory to load vars for managed-node2 46400 1727204515.04689: Calling groups_inventory to load vars for managed-node2 46400 1727204515.04693: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.04703: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.04706: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.04708: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.04862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.04989: done with get_vars() 46400 1727204515.04998: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.376) 0:00:05.335 ***** 46400 1727204515.05065: entering _queue_task() for managed-node2/include_tasks 46400 1727204515.05268: worker is 1 (out of 1 available) 46400 1727204515.05279: exiting _queue_task() for managed-node2/include_tasks 46400 1727204515.05291: done queuing things up, now waiting for results queue to drain 46400 1727204515.05293: waiting for pending results... 46400 1727204515.05447: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204515.05514: in run() - task 0affcd87-79f5-1303-fda8-00000000011f 46400 1727204515.05521: variable 'ansible_search_path' from source: unknown 46400 1727204515.05524: variable 'ansible_search_path' from source: unknown 46400 1727204515.05556: calling self._execute() 46400 1727204515.05620: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.05625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.05633: variable 'omit' from source: magic vars 46400 1727204515.05936: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.05953: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.05969: _execute() done 46400 1727204515.05977: dumping result to json 46400 1727204515.05984: done dumping result, returning 46400 1727204515.05993: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-00000000011f] 46400 1727204515.06003: sending task result for task 0affcd87-79f5-1303-fda8-00000000011f 46400 1727204515.06213: no more pending results, returning what we have 46400 1727204515.06218: in VariableManager get_vars() 46400 1727204515.07177: Calling all_inventory to load vars for managed-node2 46400 1727204515.07181: Calling groups_inventory to load vars for managed-node2 46400 1727204515.07188: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.07204: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.07207: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.07210: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.08084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.08251: done with get_vars() 46400 1727204515.08288: variable 'ansible_search_path' from source: unknown 
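
The include task just queued corresponds to line 3 of tasks/assert_device_absent.yml, which pulls in tasks/get_interface_stat.yml (per the task name and paths shown in the log). A minimal sketch of such an include is given below; the real file contents are not reproduced in the log, and attaching the distribution-version conditional directly to the task is an assumption (the log only reports that ansible_distribution_major_version != '6' was evaluated for it and may inherit it from an enclosing block or play).

    - name: Include the task 'get_interface_stat.yml'
      ansible.builtin.include_tasks: get_interface_stat.yml
      when: ansible_distribution_major_version != '6'  # assumption: possibly inherited rather than set here
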
46400 1727204515.08290: variable 'ansible_search_path' from source: unknown 46400 1727204515.08300: variable 'item' from source: include params 46400 1727204515.08491: variable 'item' from source: include params 46400 1727204515.08510: done sending task result for task 0affcd87-79f5-1303-fda8-00000000011f 46400 1727204515.08513: WORKER PROCESS EXITING 46400 1727204515.08539: we have included files to process 46400 1727204515.08540: generating all_blocks data 46400 1727204515.08542: done generating all_blocks data 46400 1727204515.08546: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204515.08547: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204515.08549: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204515.08781: done processing included file 46400 1727204515.08783: iterating over new_blocks loaded from include file 46400 1727204515.08785: in VariableManager get_vars() 46400 1727204515.08833: done with get_vars() 46400 1727204515.08836: filtering new block on tags 46400 1727204515.08869: done filtering new block on tags 46400 1727204515.08872: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204515.08877: extending task lists for all hosts with included blocks 46400 1727204515.09202: done extending task lists 46400 1727204515.09203: done processing included files 46400 1727204515.09204: results queue empty 46400 1727204515.09205: checking for any_errors_fatal 46400 1727204515.09212: done checking for any_errors_fatal 46400 1727204515.09213: checking for max_fail_percentage 46400 1727204515.09214: done checking for max_fail_percentage 46400 1727204515.09215: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.09215: done checking to see if all hosts have failed 46400 1727204515.09216: getting the remaining hosts for this loop 46400 1727204515.09217: done getting the remaining hosts for this loop 46400 1727204515.09220: getting the next task for host managed-node2 46400 1727204515.09224: done getting next task for host managed-node2 46400 1727204515.09227: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204515.09230: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204515.09232: getting variables 46400 1727204515.09233: in VariableManager get_vars() 46400 1727204515.09243: Calling all_inventory to load vars for managed-node2 46400 1727204515.09245: Calling groups_inventory to load vars for managed-node2 46400 1727204515.09247: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.09253: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.09255: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.09258: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.09400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.10705: done with get_vars() 46400 1727204515.10717: done getting variables 46400 1727204515.10868: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.058) 0:00:05.393 ***** 46400 1727204515.10900: entering _queue_task() for managed-node2/stat 46400 1727204515.11199: worker is 1 (out of 1 available) 46400 1727204515.11213: exiting _queue_task() for managed-node2/stat 46400 1727204515.11226: done queuing things up, now waiting for results queue to drain 46400 1727204515.11228: waiting for pending results... 46400 1727204515.12131: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204515.12266: in run() - task 0affcd87-79f5-1303-fda8-00000000016e 46400 1727204515.12290: variable 'ansible_search_path' from source: unknown 46400 1727204515.12307: variable 'ansible_search_path' from source: unknown 46400 1727204515.12583: calling self._execute() 46400 1727204515.12670: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.12682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.12694: variable 'omit' from source: magic vars 46400 1727204515.13415: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.13487: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.13500: variable 'omit' from source: magic vars 46400 1727204515.13687: variable 'omit' from source: magic vars 46400 1727204515.13915: variable 'interface' from source: play vars 46400 1727204515.13938: variable 'omit' from source: magic vars 46400 1727204515.14043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204515.14208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204515.14236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204515.14387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204515.14535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204515.14571: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204515.14582: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.14591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.14762: Set connection var ansible_shell_type to sh 46400 1727204515.14786: Set connection var ansible_shell_executable to /bin/sh 46400 1727204515.14799: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204515.14832: Set connection var ansible_connection to ssh 46400 1727204515.14847: Set connection var ansible_pipelining to False 46400 1727204515.14868: Set connection var ansible_timeout to 10 46400 1727204515.14900: variable 'ansible_shell_executable' from source: unknown 46400 1727204515.14909: variable 'ansible_connection' from source: unknown 46400 1727204515.14918: variable 'ansible_module_compression' from source: unknown 46400 1727204515.14925: variable 'ansible_shell_type' from source: unknown 46400 1727204515.14932: variable 'ansible_shell_executable' from source: unknown 46400 1727204515.14939: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.14945: variable 'ansible_pipelining' from source: unknown 46400 1727204515.14951: variable 'ansible_timeout' from source: unknown 46400 1727204515.14977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.15198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204515.15216: variable 'omit' from source: magic vars 46400 1727204515.15226: starting attempt loop 46400 1727204515.15233: running the handler 46400 1727204515.15251: _low_level_execute_command(): starting 46400 1727204515.15272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204515.16037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204515.16056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.16081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.16101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.16144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.16158: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204515.16180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.16200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204515.16213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204515.16225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204515.16238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.16252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.16275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.16291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 
1727204515.16302: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204515.16317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.16403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.16427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.16448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.17313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.18182: stdout chunk (state=3): >>>/root <<< 46400 1727204515.18385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.18389: stdout chunk (state=3): >>><<< 46400 1727204515.18392: stderr chunk (state=3): >>><<< 46400 1727204515.18522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204515.18527: _low_level_execute_command(): starting 46400 1727204515.18531: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478 `" && echo ansible-tmp-1727204515.1841507-46772-161984473200478="` echo /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478 `" ) && sleep 0' 46400 1727204515.19146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204515.19162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.19184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.19201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.19244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.19256: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204515.19277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.19296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204515.19307: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204515.19316: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204515.19328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.19341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.19358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.19377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.19391: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204515.19404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.19485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.19510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.19525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.19597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.21488: stdout chunk (state=3): >>>ansible-tmp-1727204515.1841507-46772-161984473200478=/root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478 <<< 46400 1727204515.21615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.21725: stderr chunk (state=3): >>><<< 46400 1727204515.21743: stdout chunk (state=3): >>><<< 46400 1727204515.21976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204515.1841507-46772-161984473200478=/root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204515.21980: variable 'ansible_module_compression' from source: unknown 46400 1727204515.21983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204515.21985: variable 'ansible_facts' from source: unknown 46400 1727204515.22045: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/AnsiballZ_stat.py 46400 1727204515.22671: Sending initial data 46400 1727204515.22674: Sent initial data (153 bytes) 46400 1727204515.23696: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 
1727204515.23713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.23731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.23748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.23797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.23809: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204515.23823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.23843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204515.23854: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204515.23868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204515.23880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.23892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.23906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.23917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.23927: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204515.23944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.24025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.24055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.24076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.24151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.25888: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204515.25931: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204515.25967: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmptq03yolp /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/AnsiballZ_stat.py <<< 46400 1727204515.25998: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204515.27120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.27437: stderr chunk (state=3): >>><<< 46400 1727204515.27441: stdout chunk (state=3): >>><<< 46400 1727204515.27444: done transferring module to remote 46400 1727204515.27446: _low_level_execute_command(): starting 
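
The module being transferred here is AnsiballZ_stat.py for the "Get stat for interface statebr" task at tasks/get_interface_stat.yml:3; the module arguments echoed later in the log are path=/sys/class/net/statebr with get_attributes, get_checksum and get_mime disabled. A minimal sketch of an equivalent task, with a hypothetical register name since the real one is not visible in this excerpt:

    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: /sys/class/net/{{ interface }}  # log shows the rendered path /sys/class/net/statebr
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat  # assumption: register name chosen for illustration only
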
46400 1727204515.27453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/ /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/AnsiballZ_stat.py && sleep 0' 46400 1727204515.28157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204515.28181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.28198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.28217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.28279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.28292: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204515.28307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.28325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204515.28339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204515.28357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204515.28375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.28389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.28404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.28415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204515.28425: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204515.28439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.28529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.28552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.28580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.28654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.30486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.30490: stdout chunk (state=3): >>><<< 46400 1727204515.30492: stderr chunk (state=3): >>><<< 46400 1727204515.30597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204515.30600: _low_level_execute_command(): starting 46400 1727204515.30603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/AnsiballZ_stat.py && sleep 0' 46400 1727204515.31407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204515.31411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.31441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204515.31446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.31449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.31520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.31524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.31625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.31710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.44898: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204515.45956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204515.45959: stdout chunk (state=3): >>><<< 46400 1727204515.45962: stderr chunk (state=3): >>><<< 46400 1727204515.46095: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204515.46099: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204515.46106: _low_level_execute_command(): starting 46400 1727204515.46108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204515.1841507-46772-161984473200478/ > /dev/null 2>&1 && sleep 0' 46400 1727204515.47627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204515.47631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204515.47665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.47670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
46400 1727204515.47673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204515.47855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204515.47913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204515.47917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204515.47994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204515.49758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204515.49834: stderr chunk (state=3): >>><<< 46400 1727204515.49837: stdout chunk (state=3): >>><<< 46400 1727204515.49975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204515.49981: handler run complete 46400 1727204515.49984: attempt loop complete, returning result 46400 1727204515.49986: _execute() done 46400 1727204515.49988: dumping result to json 46400 1727204515.49990: done dumping result, returning 46400 1727204515.49992: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-00000000016e] 46400 1727204515.49994: sending task result for task 0affcd87-79f5-1303-fda8-00000000016e 46400 1727204515.50076: done sending task result for task 0affcd87-79f5-1303-fda8-00000000016e 46400 1727204515.50081: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204515.50139: no more pending results, returning what we have 46400 1727204515.50143: results queue empty 46400 1727204515.50144: checking for any_errors_fatal 46400 1727204515.50145: done checking for any_errors_fatal 46400 1727204515.50145: checking for max_fail_percentage 46400 1727204515.50147: done checking for max_fail_percentage 46400 1727204515.50148: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.50149: done checking to see if all hosts have failed 46400 1727204515.50149: getting the remaining hosts for this loop 46400 1727204515.50151: done getting the remaining hosts for this loop 46400 1727204515.50155: getting the next task for host managed-node2 
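The result above (rc=0, stat.exists false) comes from the "Get stat for interface statebr" task, whose module arguments are spelled out in the _execute_module() record: path /sys/class/net/statebr with get_attributes, get_checksum and get_mime all disabled. The task file itself is not part of this log, so the following is only a reconstruction; in particular, wiring the result straight into interface_stat via register is a simplification, since the log attributes that variable to set_fact.

```yaml
# Reconstructed sketch of the "Get stat for interface statebr" task.
# Module arguments are copied from the _execute_module() call logged above;
# the register name is an assumption (the log says interface_stat comes from set_fact).
- name: Get stat for interface statebr
  ansible.builtin.stat:
    path: /sys/class/net/statebr   # /sys/class/net/<name> exists only while the interface exists
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

With the interface absent on managed-node2, the module returns {"stat": {"exists": false}}, which is exactly what the assert that follows checks.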
46400 1727204515.50168: done getting next task for host managed-node2 46400 1727204515.50170: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 46400 1727204515.50175: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204515.50179: getting variables 46400 1727204515.50180: in VariableManager get_vars() 46400 1727204515.50210: Calling all_inventory to load vars for managed-node2 46400 1727204515.50212: Calling groups_inventory to load vars for managed-node2 46400 1727204515.50215: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.50226: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.50228: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.50231: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.50444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.50656: done with get_vars() 46400 1727204515.50671: done getting variables 46400 1727204515.50776: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 46400 1727204515.51186: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.403) 0:00:05.796 ***** 46400 1727204515.51217: entering _queue_task() for managed-node2/assert 46400 1727204515.51219: Creating lock for assert 46400 1727204515.52032: worker is 1 (out of 1 available) 46400 1727204515.52047: exiting _queue_task() for managed-node2/assert 46400 1727204515.52063: done queuing things up, now waiting for results queue to drain 46400 1727204515.52067: waiting for pending results... 
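The task being queued here lives at tasks/assert_device_absent.yml:5 and, judging by the conditional the handler evaluates a few records further down (not interface_stat.stat.exists), it is a plain assert over the stat result gathered above. A minimal sketch; the failure message is assumed, not taken from the test file:

```yaml
# Sketch of the assert task from tasks/assert_device_absent.yml.
# The 'that' expression matches the conditional evaluated in the log; msg is assumed.
- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists
    msg: "Interface '{{ interface }}' is unexpectedly present"   # assumed wording
```

The interface variable resolves from play vars to 'statebr', which is why the TASK banner above shows the rendered name.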
46400 1727204515.53023: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 46400 1727204515.53145: in run() - task 0affcd87-79f5-1303-fda8-000000000120 46400 1727204515.53232: variable 'ansible_search_path' from source: unknown 46400 1727204515.53242: variable 'ansible_search_path' from source: unknown 46400 1727204515.53291: calling self._execute() 46400 1727204515.53406: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.53533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.53555: variable 'omit' from source: magic vars 46400 1727204515.54387: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.54406: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.54425: variable 'omit' from source: magic vars 46400 1727204515.54492: variable 'omit' from source: magic vars 46400 1727204515.54792: variable 'interface' from source: play vars 46400 1727204515.54832: variable 'omit' from source: magic vars 46400 1727204515.55009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204515.55049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204515.55089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204515.55129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204515.55207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204515.55257: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204515.55392: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.56113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.56445: Set connection var ansible_shell_type to sh 46400 1727204515.56465: Set connection var ansible_shell_executable to /bin/sh 46400 1727204515.56552: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204515.56566: Set connection var ansible_connection to ssh 46400 1727204515.56577: Set connection var ansible_pipelining to False 46400 1727204515.56586: Set connection var ansible_timeout to 10 46400 1727204515.56618: variable 'ansible_shell_executable' from source: unknown 46400 1727204515.56659: variable 'ansible_connection' from source: unknown 46400 1727204515.56671: variable 'ansible_module_compression' from source: unknown 46400 1727204515.56734: variable 'ansible_shell_type' from source: unknown 46400 1727204515.56741: variable 'ansible_shell_executable' from source: unknown 46400 1727204515.56747: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.56772: variable 'ansible_pipelining' from source: unknown 46400 1727204515.56876: variable 'ansible_timeout' from source: unknown 46400 1727204515.56884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.57259: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204515.57432: variable 'omit' from source: magic vars 46400 1727204515.57442: starting attempt loop 46400 1727204515.57449: running the handler 46400 1727204515.57738: variable 'interface_stat' from source: set_fact 46400 1727204515.57782: Evaluated conditional (not interface_stat.stat.exists): True 46400 1727204515.57793: handler run complete 46400 1727204515.57810: attempt loop complete, returning result 46400 1727204515.57858: _execute() done 46400 1727204515.57872: dumping result to json 46400 1727204515.57879: done dumping result, returning 46400 1727204515.57890: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [0affcd87-79f5-1303-fda8-000000000120] 46400 1727204515.57900: sending task result for task 0affcd87-79f5-1303-fda8-000000000120 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204515.58111: no more pending results, returning what we have 46400 1727204515.58115: results queue empty 46400 1727204515.58116: checking for any_errors_fatal 46400 1727204515.58125: done checking for any_errors_fatal 46400 1727204515.58126: checking for max_fail_percentage 46400 1727204515.58128: done checking for max_fail_percentage 46400 1727204515.58129: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.58130: done checking to see if all hosts have failed 46400 1727204515.58130: getting the remaining hosts for this loop 46400 1727204515.58132: done getting the remaining hosts for this loop 46400 1727204515.58136: getting the next task for host managed-node2 46400 1727204515.58145: done getting next task for host managed-node2 46400 1727204515.58149: ^ task is: TASK: Test 46400 1727204515.58152: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204515.58156: getting variables 46400 1727204515.58158: in VariableManager get_vars() 46400 1727204515.58192: Calling all_inventory to load vars for managed-node2 46400 1727204515.58195: Calling groups_inventory to load vars for managed-node2 46400 1727204515.58199: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.58211: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.58213: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.58216: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.58405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.58612: done with get_vars() 46400 1727204515.58624: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.075) 0:00:05.871 ***** 46400 1727204515.58732: entering _queue_task() for managed-node2/include_tasks 46400 1727204515.58749: done sending task result for task 0affcd87-79f5-1303-fda8-000000000120 46400 1727204515.58759: WORKER PROCESS EXITING 46400 1727204515.59730: worker is 1 (out of 1 available) 46400 1727204515.59743: exiting _queue_task() for managed-node2/include_tasks 46400 1727204515.59756: done queuing things up, now waiting for results queue to drain 46400 1727204515.59758: waiting for pending results... 46400 1727204515.60538: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204515.60658: in run() - task 0affcd87-79f5-1303-fda8-000000000095 46400 1727204515.60788: variable 'ansible_search_path' from source: unknown 46400 1727204515.60877: variable 'ansible_search_path' from source: unknown 46400 1727204515.60932: variable 'lsr_test' from source: include params 46400 1727204515.61375: variable 'lsr_test' from source: include params 46400 1727204515.61555: variable 'omit' from source: magic vars 46400 1727204515.61705: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.61773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.61883: variable 'omit' from source: magic vars 46400 1727204515.62469: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.62486: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.62497: variable 'item' from source: unknown 46400 1727204515.62675: variable 'item' from source: unknown 46400 1727204515.62706: variable 'item' from source: unknown 46400 1727204515.62877: variable 'item' from source: unknown 46400 1727204515.63091: dumping result to json 46400 1727204515.63101: done dumping result, returning 46400 1727204515.63111: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-000000000095] 46400 1727204515.63123: sending task result for task 0affcd87-79f5-1303-fda8-000000000095 46400 1727204515.63202: done sending task result for task 0affcd87-79f5-1303-fda8-000000000095 46400 1727204515.63210: WORKER PROCESS EXITING 46400 1727204515.63256: no more pending results, returning what we have 46400 1727204515.63262: in VariableManager get_vars() 46400 1727204515.63304: Calling all_inventory to load vars for managed-node2 46400 1727204515.63309: Calling groups_inventory to load vars for managed-node2 46400 1727204515.63313: 
Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.63327: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.63330: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.63334: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.63583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.63790: done with get_vars() 46400 1727204515.63803: variable 'ansible_search_path' from source: unknown 46400 1727204515.63804: variable 'ansible_search_path' from source: unknown 46400 1727204515.63845: we have included files to process 46400 1727204515.63846: generating all_blocks data 46400 1727204515.63848: done generating all_blocks data 46400 1727204515.63853: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204515.63854: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204515.63856: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204515.64877: done processing included file 46400 1727204515.64879: iterating over new_blocks loaded from include file 46400 1727204515.64881: in VariableManager get_vars() 46400 1727204515.65122: done with get_vars() 46400 1727204515.65125: filtering new block on tags 46400 1727204515.65163: done filtering new block on tags 46400 1727204515.65168: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 46400 1727204515.65174: extending task lists for all hosts with included blocks 46400 1727204515.67793: done extending task lists 46400 1727204515.67795: done processing included files 46400 1727204515.67796: results queue empty 46400 1727204515.67797: checking for any_errors_fatal 46400 1727204515.67801: done checking for any_errors_fatal 46400 1727204515.67802: checking for max_fail_percentage 46400 1727204515.67803: done checking for max_fail_percentage 46400 1727204515.67804: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.67804: done checking to see if all hosts have failed 46400 1727204515.67805: getting the remaining hosts for this loop 46400 1727204515.67807: done getting the remaining hosts for this loop 46400 1727204515.67809: getting the next task for host managed-node2 46400 1727204515.67814: done getting next task for host managed-node2 46400 1727204515.67816: ^ task is: TASK: Include network role 46400 1727204515.67819: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204515.67821: getting variables 46400 1727204515.67822: in VariableManager get_vars() 46400 1727204515.67950: Calling all_inventory to load vars for managed-node2 46400 1727204515.67954: Calling groups_inventory to load vars for managed-node2 46400 1727204515.67957: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.67963: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.67968: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.67972: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.68390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.68698: done with get_vars() 46400 1727204515.68836: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.101) 0:00:05.973 ***** 46400 1727204515.69036: entering _queue_task() for managed-node2/include_role 46400 1727204515.69038: Creating lock for include_role 46400 1727204515.69735: worker is 1 (out of 1 available) 46400 1727204515.69748: exiting _queue_task() for managed-node2/include_role 46400 1727204515.69762: done queuing things up, now waiting for results queue to drain 46400 1727204515.69766: waiting for pending results... 46400 1727204515.71744: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204515.72054: in run() - task 0affcd87-79f5-1303-fda8-00000000018e 46400 1727204515.72077: variable 'ansible_search_path' from source: unknown 46400 1727204515.72087: variable 'ansible_search_path' from source: unknown 46400 1727204515.72129: calling self._execute() 46400 1727204515.72336: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.72347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.72359: variable 'omit' from source: magic vars 46400 1727204515.73135: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.73246: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.73257: _execute() done 46400 1727204515.73266: dumping result to json 46400 1727204515.73274: done dumping result, returning 46400 1727204515.73283: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-00000000018e] 46400 1727204515.73292: sending task result for task 0affcd87-79f5-1303-fda8-00000000018e 46400 1727204515.73440: no more pending results, returning what we have 46400 1727204515.73445: in VariableManager get_vars() 46400 1727204515.73482: Calling all_inventory to load vars for managed-node2 46400 1727204515.73486: Calling groups_inventory to load vars for managed-node2 46400 1727204515.73489: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.73503: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.73506: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.73509: Calling groups_plugins_play to load vars for managed-node2 46400 
1727204515.73706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.73902: done with get_vars() 46400 1727204515.73911: variable 'ansible_search_path' from source: unknown 46400 1727204515.73912: variable 'ansible_search_path' from source: unknown 46400 1727204515.74514: variable 'omit' from source: magic vars 46400 1727204515.74558: variable 'omit' from source: magic vars 46400 1727204515.74727: done sending task result for task 0affcd87-79f5-1303-fda8-00000000018e 46400 1727204515.74730: WORKER PROCESS EXITING 46400 1727204515.74576: variable 'omit' from source: magic vars 46400 1727204515.74734: we have included files to process 46400 1727204515.74735: generating all_blocks data 46400 1727204515.74737: done generating all_blocks data 46400 1727204515.74738: processing included file: fedora.linux_system_roles.network 46400 1727204515.74758: in VariableManager get_vars() 46400 1727204515.74773: done with get_vars() 46400 1727204515.74833: in VariableManager get_vars() 46400 1727204515.74849: done with get_vars() 46400 1727204515.74892: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204515.75553: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204515.75895: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204515.77408: in VariableManager get_vars() 46400 1727204515.77433: done with get_vars() 46400 1727204515.78352: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204515.80869: iterating over new_blocks loaded from include file 46400 1727204515.80872: in VariableManager get_vars() 46400 1727204515.80893: done with get_vars() 46400 1727204515.80895: filtering new block on tags 46400 1727204515.81199: done filtering new block on tags 46400 1727204515.81203: in VariableManager get_vars() 46400 1727204515.81219: done with get_vars() 46400 1727204515.81220: filtering new block on tags 46400 1727204515.81242: done filtering new block on tags 46400 1727204515.81245: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204515.81251: extending task lists for all hosts with included blocks 46400 1727204515.81422: done extending task lists 46400 1727204515.81423: done processing included files 46400 1727204515.81424: results queue empty 46400 1727204515.81425: checking for any_errors_fatal 46400 1727204515.81429: done checking for any_errors_fatal 46400 1727204515.81429: checking for max_fail_percentage 46400 1727204515.81431: done checking for max_fail_percentage 46400 1727204515.81431: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.81432: done checking to see if all hosts have failed 46400 1727204515.81433: getting the remaining hosts for this loop 46400 1727204515.81434: done getting the remaining hosts for this loop 46400 1727204515.81437: getting the next task for host managed-node2 46400 1727204515.81441: done getting next task for host managed-node2 46400 1727204515.81444: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204515.81447: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204515.81461: getting variables 46400 1727204515.81463: in VariableManager get_vars() 46400 1727204515.81478: Calling all_inventory to load vars for managed-node2 46400 1727204515.81481: Calling groups_inventory to load vars for managed-node2 46400 1727204515.81483: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.81489: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.81492: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.81494: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.81714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.81946: done with get_vars() 46400 1727204515.81956: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.130) 0:00:06.105 ***** 46400 1727204515.82117: entering _queue_task() for managed-node2/include_tasks 46400 1727204515.82441: worker is 1 (out of 1 available) 46400 1727204515.82457: exiting _queue_task() for managed-node2/include_tasks 46400 1727204515.82471: done queuing things up, now waiting for results queue to drain 46400 1727204515.82473: waiting for pending results... 
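The "Include network role" task at create_bridge_profile.yml:3 hands control to fedora.linux_system_roles.network via include_role; the defaults, meta and tasks files the loader reads are listed above. The variables the test passes to the role are not visible in this excerpt, so the network_connections payload below is purely illustrative of how a bridge profile is normally described to this role, not the actual test data:

```yaml
# Sketch of the include_role invocation logged above.
# The role name is taken from the log; the network_connections value is illustrative only.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr        # profile name guessed from the interface used in this test
        type: bridge
        state: up            # illustrative; the real profile settings are not logged here
```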
46400 1727204515.83324: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204515.84003: in run() - task 0affcd87-79f5-1303-fda8-00000000020c 46400 1727204515.84021: variable 'ansible_search_path' from source: unknown 46400 1727204515.84028: variable 'ansible_search_path' from source: unknown 46400 1727204515.84077: calling self._execute() 46400 1727204515.84167: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.84679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.84694: variable 'omit' from source: magic vars 46400 1727204515.85047: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.85071: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.85084: _execute() done 46400 1727204515.85093: dumping result to json 46400 1727204515.85100: done dumping result, returning 46400 1727204515.85129: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-00000000020c] 46400 1727204515.85143: sending task result for task 0affcd87-79f5-1303-fda8-00000000020c 46400 1727204515.85297: no more pending results, returning what we have 46400 1727204515.85303: in VariableManager get_vars() 46400 1727204515.85351: Calling all_inventory to load vars for managed-node2 46400 1727204515.85355: Calling groups_inventory to load vars for managed-node2 46400 1727204515.85358: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.85373: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.85376: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.85380: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.85542: done sending task result for task 0affcd87-79f5-1303-fda8-00000000020c 46400 1727204515.85546: WORKER PROCESS EXITING 46400 1727204515.85565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.85779: done with get_vars() 46400 1727204515.85787: variable 'ansible_search_path' from source: unknown 46400 1727204515.85788: variable 'ansible_search_path' from source: unknown 46400 1727204515.85828: we have included files to process 46400 1727204515.85829: generating all_blocks data 46400 1727204515.85831: done generating all_blocks data 46400 1727204515.85835: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204515.85836: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204515.85838: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204515.87472: done processing included file 46400 1727204515.87474: iterating over new_blocks loaded from include file 46400 1727204515.87476: in VariableManager get_vars() 46400 1727204515.87618: done with get_vars() 46400 1727204515.87620: filtering new block on tags 46400 1727204515.87655: done filtering new block on tags 46400 1727204515.87658: in VariableManager get_vars() 46400 1727204515.87682: done with get_vars() 46400 1727204515.87684: filtering new block on tags 46400 1727204515.87974: done filtering new block on tags 46400 1727204515.87977: in 
VariableManager get_vars() 46400 1727204515.87999: done with get_vars() 46400 1727204515.88001: filtering new block on tags 46400 1727204515.88045: done filtering new block on tags 46400 1727204515.88047: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204515.88053: extending task lists for all hosts with included blocks 46400 1727204515.93707: done extending task lists 46400 1727204515.93709: done processing included files 46400 1727204515.93709: results queue empty 46400 1727204515.93710: checking for any_errors_fatal 46400 1727204515.93713: done checking for any_errors_fatal 46400 1727204515.93714: checking for max_fail_percentage 46400 1727204515.93715: done checking for max_fail_percentage 46400 1727204515.93716: checking to see if all hosts have failed and the running result is not ok 46400 1727204515.93717: done checking to see if all hosts have failed 46400 1727204515.93718: getting the remaining hosts for this loop 46400 1727204515.93719: done getting the remaining hosts for this loop 46400 1727204515.93722: getting the next task for host managed-node2 46400 1727204515.93727: done getting next task for host managed-node2 46400 1727204515.93730: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204515.93735: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204515.93744: getting variables 46400 1727204515.93746: in VariableManager get_vars() 46400 1727204515.93762: Calling all_inventory to load vars for managed-node2 46400 1727204515.93767: Calling groups_inventory to load vars for managed-node2 46400 1727204515.93769: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204515.93775: Calling all_plugins_play to load vars for managed-node2 46400 1727204515.93777: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204515.93780: Calling groups_plugins_play to load vars for managed-node2 46400 1727204515.93932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204515.94145: done with get_vars() 46400 1727204515.94157: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.121) 0:00:06.226 ***** 46400 1727204515.94241: entering _queue_task() for managed-node2/setup 46400 1727204515.95242: worker is 1 (out of 1 available) 46400 1727204515.95255: exiting _queue_task() for managed-node2/setup 46400 1727204515.95268: done queuing things up, now waiting for results queue to drain 46400 1727204515.95269: waiting for pending results... 46400 1727204515.96824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204515.97412: in run() - task 0affcd87-79f5-1303-fda8-000000000269 46400 1727204515.97433: variable 'ansible_search_path' from source: unknown 46400 1727204515.97442: variable 'ansible_search_path' from source: unknown 46400 1727204515.97491: calling self._execute() 46400 1727204515.97784: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204515.97797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204515.97815: variable 'omit' from source: magic vars 46400 1727204515.98377: variable 'ansible_distribution_major_version' from source: facts 46400 1727204515.98405: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204515.98670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204516.01139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204516.01236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204516.01289: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204516.01329: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204516.01364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204516.01471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204516.01511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204516.01569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204516.01620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204516.01655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204516.01718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204516.01746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204516.01790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204516.01847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204516.01871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204516.02072: variable '__network_required_facts' from source: role '' defaults 46400 1727204516.02090: variable 'ansible_facts' from source: unknown 46400 1727204516.02221: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204516.02229: when evaluation is False, skipping this task 46400 1727204516.02236: _execute() done 46400 1727204516.02248: dumping result to json 46400 1727204516.02255: done dumping result, returning 46400 1727204516.02272: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000000269] 46400 1727204516.02282: sending task result for task 0affcd87-79f5-1303-fda8-000000000269 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204516.02446: no more pending results, returning what we have 46400 1727204516.02451: results queue empty 46400 1727204516.02452: checking for any_errors_fatal 46400 1727204516.02454: done checking for any_errors_fatal 46400 1727204516.02455: checking for max_fail_percentage 46400 1727204516.02456: done checking for max_fail_percentage 46400 1727204516.02458: checking to see if all hosts have failed and the running result is not ok 46400 1727204516.02459: done checking to see if all hosts have failed 46400 1727204516.02459: getting the remaining hosts for this loop 46400 1727204516.02465: done getting the remaining hosts for this loop 46400 1727204516.02469: getting the next task for host managed-node2 46400 1727204516.02482: done getting next task for host 
managed-node2 46400 1727204516.02487: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204516.02493: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204516.02507: getting variables 46400 1727204516.02509: in VariableManager get_vars() 46400 1727204516.02548: Calling all_inventory to load vars for managed-node2 46400 1727204516.02551: Calling groups_inventory to load vars for managed-node2 46400 1727204516.02554: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204516.02570: Calling all_plugins_play to load vars for managed-node2 46400 1727204516.02573: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204516.02577: Calling groups_plugins_play to load vars for managed-node2 46400 1727204516.03325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204516.03704: done with get_vars() 46400 1727204516.03716: done getting variables 46400 1727204516.04062: done sending task result for task 0affcd87-79f5-1303-fda8-000000000269 46400 1727204516.04069: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.099) 0:00:06.326 ***** 46400 1727204516.04151: entering _queue_task() for managed-node2/stat 46400 1727204516.04885: worker is 1 (out of 1 available) 46400 1727204516.04896: exiting _queue_task() for managed-node2/stat 46400 1727204516.04909: done queuing things up, now waiting for results queue to drain 46400 1727204516.04911: waiting for pending results... 
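The skip recorded above is the role's guard against redundant fact gathering: __network_required_facts (defined in the role defaults) is diffed against the keys already present in ansible_facts, and the setup module only runs when something the role needs is missing. Since this play gathered facts earlier, the difference is empty and the conditional evaluates to False. The pattern, with the module arguments left out because the skipped task never logged them:

```yaml
# Sketch of the conditional fact-gathering guard evaluated above.
# The 'when' expression is the one logged; any setup arguments (e.g. gather_subset)
# are not visible in this log.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```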
46400 1727204516.06038: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204516.06311: in run() - task 0affcd87-79f5-1303-fda8-00000000026b 46400 1727204516.06447: variable 'ansible_search_path' from source: unknown 46400 1727204516.06459: variable 'ansible_search_path' from source: unknown 46400 1727204516.06506: calling self._execute() 46400 1727204516.06698: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204516.06711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204516.06726: variable 'omit' from source: magic vars 46400 1727204516.07283: variable 'ansible_distribution_major_version' from source: facts 46400 1727204516.07299: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204516.07531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204516.07826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204516.07918: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204516.07967: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204516.08012: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204516.08115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204516.08149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204516.08191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204516.08227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204516.08331: variable '__network_is_ostree' from source: set_fact 46400 1727204516.08342: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204516.08349: when evaluation is False, skipping this task 46400 1727204516.08355: _execute() done 46400 1727204516.08365: dumping result to json 46400 1727204516.08375: done dumping result, returning 46400 1727204516.08386: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-00000000026b] 46400 1727204516.08401: sending task result for task 0affcd87-79f5-1303-fda8-00000000026b skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204516.08559: no more pending results, returning what we have 46400 1727204516.08572: results queue empty 46400 1727204516.08573: checking for any_errors_fatal 46400 1727204516.08582: done checking for any_errors_fatal 46400 1727204516.08583: checking for max_fail_percentage 46400 1727204516.08585: done checking for max_fail_percentage 46400 1727204516.08586: checking to see if all hosts have 
failed and the running result is not ok 46400 1727204516.08587: done checking to see if all hosts have failed 46400 1727204516.08588: getting the remaining hosts for this loop 46400 1727204516.08589: done getting the remaining hosts for this loop 46400 1727204516.08594: getting the next task for host managed-node2 46400 1727204516.08603: done getting next task for host managed-node2 46400 1727204516.08608: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204516.08613: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204516.08628: getting variables 46400 1727204516.08630: in VariableManager get_vars() 46400 1727204516.08672: Calling all_inventory to load vars for managed-node2 46400 1727204516.08677: Calling groups_inventory to load vars for managed-node2 46400 1727204516.08680: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204516.08691: Calling all_plugins_play to load vars for managed-node2 46400 1727204516.08694: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204516.08697: Calling groups_plugins_play to load vars for managed-node2 46400 1727204516.08897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204516.09127: done with get_vars() 46400 1727204516.09180: done getting variables 46400 1727204516.09319: done sending task result for task 0affcd87-79f5-1303-fda8-00000000026b 46400 1727204516.09322: WORKER PROCESS EXITING 46400 1727204516.09416: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.053) 0:00:06.379 ***** 46400 1727204516.09512: entering _queue_task() for managed-node2/set_fact 46400 1727204516.10026: worker is 1 (out of 1 available) 46400 1727204516.10040: exiting _queue_task() for managed-node2/set_fact 46400 1727204516.10053: done queuing things up, now waiting for results queue to drain 46400 1727204516.10055: waiting for pending results... 
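(For context: the two ostree tasks traced here come from roles/network/tasks/set_facts.yml in the fedora.linux_system_roles.network collection. Their guard, shown in the skip results as "not __network_is_ostree is defined", evaluates to False because __network_is_ostree was already set by set_fact earlier in the run, so both tasks are skipped. A minimal sketch of such a task pair is below; only the task names and the guard condition are taken from this log, while the stat path and the registered variable name are assumptions for illustration:

    # Sketch only, assuming the common /run/ostree-booted probe; not copied from the role.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed probe path
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

With the fact already cached, only the next task in this file, "Check which services are running" (service_facts), actually executes against managed-node2, as the log below shows.)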
46400 1727204516.10334: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204516.10506: in run() - task 0affcd87-79f5-1303-fda8-00000000026c 46400 1727204516.10531: variable 'ansible_search_path' from source: unknown 46400 1727204516.10540: variable 'ansible_search_path' from source: unknown 46400 1727204516.10586: calling self._execute() 46400 1727204516.10678: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204516.10691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204516.10705: variable 'omit' from source: magic vars 46400 1727204516.11206: variable 'ansible_distribution_major_version' from source: facts 46400 1727204516.11223: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204516.11483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204516.12015: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204516.12123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204516.12174: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204516.12218: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204516.12317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204516.12349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204516.12386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204516.12426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204516.12529: variable '__network_is_ostree' from source: set_fact 46400 1727204516.12603: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204516.12614: when evaluation is False, skipping this task 46400 1727204516.12626: _execute() done 46400 1727204516.12634: dumping result to json 46400 1727204516.12641: done dumping result, returning 46400 1727204516.12653: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-00000000026c] 46400 1727204516.12669: sending task result for task 0affcd87-79f5-1303-fda8-00000000026c skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204516.12821: no more pending results, returning what we have 46400 1727204516.12826: results queue empty 46400 1727204516.12827: checking for any_errors_fatal 46400 1727204516.12832: done checking for any_errors_fatal 46400 1727204516.12833: checking for max_fail_percentage 46400 1727204516.12835: done checking for max_fail_percentage 46400 1727204516.12836: checking to see 
if all hosts have failed and the running result is not ok 46400 1727204516.12837: done checking to see if all hosts have failed 46400 1727204516.12838: getting the remaining hosts for this loop 46400 1727204516.12840: done getting the remaining hosts for this loop 46400 1727204516.12845: getting the next task for host managed-node2 46400 1727204516.12857: done getting next task for host managed-node2 46400 1727204516.12866: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204516.12871: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204516.12886: getting variables 46400 1727204516.12888: in VariableManager get_vars() 46400 1727204516.12928: Calling all_inventory to load vars for managed-node2 46400 1727204516.12931: Calling groups_inventory to load vars for managed-node2 46400 1727204516.12933: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204516.12944: Calling all_plugins_play to load vars for managed-node2 46400 1727204516.12947: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204516.12951: Calling groups_plugins_play to load vars for managed-node2 46400 1727204516.13188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204516.13416: done with get_vars() 46400 1727204516.13429: done getting variables 46400 1727204516.13718: done sending task result for task 0affcd87-79f5-1303-fda8-00000000026c 46400 1727204516.13721: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.042) 0:00:06.422 ***** 46400 1727204516.13777: entering _queue_task() for managed-node2/service_facts 46400 1727204516.13779: Creating lock for service_facts 46400 1727204516.14658: worker is 1 (out of 1 available) 46400 1727204516.14677: exiting _queue_task() for managed-node2/service_facts 46400 1727204516.14694: done queuing things up, now waiting for results queue to drain 46400 1727204516.14696: waiting for pending results... 46400 1727204516.15295: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204516.15677: in run() - task 0affcd87-79f5-1303-fda8-00000000026e 46400 1727204516.15699: variable 'ansible_search_path' from source: unknown 46400 1727204516.15707: variable 'ansible_search_path' from source: unknown 46400 1727204516.15745: calling self._execute() 46400 1727204516.15854: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204516.15954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204516.16043: variable 'omit' from source: magic vars 46400 1727204516.16874: variable 'ansible_distribution_major_version' from source: facts 46400 1727204516.16891: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204516.16902: variable 'omit' from source: magic vars 46400 1727204516.16996: variable 'omit' from source: magic vars 46400 1727204516.17043: variable 'omit' from source: magic vars 46400 1727204516.17098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204516.17143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204516.17178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204516.17201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204516.17218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204516.17263: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204516.17276: variable 'ansible_host' from source: host vars for 
'managed-node2' 46400 1727204516.17285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204516.17392: Set connection var ansible_shell_type to sh 46400 1727204516.17407: Set connection var ansible_shell_executable to /bin/sh 46400 1727204516.17418: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204516.17428: Set connection var ansible_connection to ssh 46400 1727204516.17437: Set connection var ansible_pipelining to False 46400 1727204516.17446: Set connection var ansible_timeout to 10 46400 1727204516.17485: variable 'ansible_shell_executable' from source: unknown 46400 1727204516.17493: variable 'ansible_connection' from source: unknown 46400 1727204516.17500: variable 'ansible_module_compression' from source: unknown 46400 1727204516.17506: variable 'ansible_shell_type' from source: unknown 46400 1727204516.17512: variable 'ansible_shell_executable' from source: unknown 46400 1727204516.17518: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204516.17525: variable 'ansible_pipelining' from source: unknown 46400 1727204516.17531: variable 'ansible_timeout' from source: unknown 46400 1727204516.17540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204516.17756: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204516.17777: variable 'omit' from source: magic vars 46400 1727204516.17786: starting attempt loop 46400 1727204516.17794: running the handler 46400 1727204516.17813: _low_level_execute_command(): starting 46400 1727204516.17824: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204516.19991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204516.20007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.20023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.20047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.20098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.20109: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204516.20122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.20139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204516.20152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204516.20169: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204516.20181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.20197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.20215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.20228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.20240: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204516.20252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.20336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204516.20354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204516.20375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204516.20612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204516.22200: stdout chunk (state=3): >>>/root <<< 46400 1727204516.22409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204516.22413: stdout chunk (state=3): >>><<< 46400 1727204516.22415: stderr chunk (state=3): >>><<< 46400 1727204516.22544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204516.22548: _low_level_execute_command(): starting 46400 1727204516.22552: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762 `" && echo ansible-tmp-1727204516.2243679-46921-268799533744762="` echo /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762 `" ) && sleep 0' 46400 1727204516.23343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204516.23368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.23385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.23404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.23447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.23462: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204516.23483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.23501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204516.23512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204516.23525: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 46400 1727204516.23672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.24226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.24243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.24254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.24270: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204516.24287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.24479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204516.24496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204516.24511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204516.24657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204516.26454: stdout chunk (state=3): >>>ansible-tmp-1727204516.2243679-46921-268799533744762=/root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762 <<< 46400 1727204516.26653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204516.26657: stdout chunk (state=3): >>><<< 46400 1727204516.26662: stderr chunk (state=3): >>><<< 46400 1727204516.26972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204516.2243679-46921-268799533744762=/root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204516.26976: variable 'ansible_module_compression' from source: unknown 46400 1727204516.26978: ANSIBALLZ: Using lock for service_facts 46400 1727204516.26981: ANSIBALLZ: Acquiring lock 46400 1727204516.26983: ANSIBALLZ: Lock acquired: 140519370372784 46400 1727204516.26985: ANSIBALLZ: Creating module 46400 1727204516.44019: ANSIBALLZ: Writing module into payload 46400 1727204516.44154: ANSIBALLZ: Writing module 46400 1727204516.44192: ANSIBALLZ: Renaming module 46400 1727204516.44208: ANSIBALLZ: Done creating module 46400 1727204516.44228: variable 'ansible_facts' from source: unknown 46400 1727204516.44311: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/AnsiballZ_service_facts.py 46400 1727204516.44470: Sending initial data 46400 1727204516.44481: Sent initial data (162 bytes) 46400 1727204516.45489: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204516.45503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.45519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.45539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.45589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.45605: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204516.45618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.45634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204516.45647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204516.45657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204516.45674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.45688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.45706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.45719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.45729: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204516.45741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.45822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204516.45838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204516.45851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204516.46051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204516.47821: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204516.47857: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204516.47901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpki3hu5_e /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/AnsiballZ_service_facts.py <<< 46400 1727204516.47934: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 
1727204516.49345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204516.49370: stderr chunk (state=3): >>><<< 46400 1727204516.49373: stdout chunk (state=3): >>><<< 46400 1727204516.49478: done transferring module to remote 46400 1727204516.49481: _low_level_execute_command(): starting 46400 1727204516.49483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/ /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/AnsiballZ_service_facts.py && sleep 0' 46400 1727204516.50949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204516.51013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.51082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.51103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.51152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.51170: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204516.51185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.51203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204516.51228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204516.51243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204516.51257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.51282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.51302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.51339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.51353: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204516.51376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.51570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204516.51588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204516.51603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204516.51779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204516.53483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204516.53570: stderr chunk (state=3): >>><<< 46400 1727204516.53573: stdout chunk (state=3): >>><<< 46400 1727204516.53671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204516.53675: _low_level_execute_command(): starting 46400 1727204516.53678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/AnsiballZ_service_facts.py && sleep 0' 46400 1727204516.55329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204516.55422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.55437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.55455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.55503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.55518: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204516.55531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.55547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204516.55558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204516.55574: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204516.55637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204516.55650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204516.55668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204516.55679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204516.55688: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204516.55698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204516.55772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204516.55857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204516.55878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204516.56076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204517.87878: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 46400 1727204517.87888: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 46400 1727204517.87891: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": 
{"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 46400 1727204517.87894: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": 
"systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 46400 1727204517.87924: stdout chunk (state=3): >>>e-dir@0.service": {"name": 
"user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-up<<< 46400 1727204517.87941: stdout chunk (state=3): >>>date.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204517.89314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204517.89344: stderr chunk (state=3): >>><<< 46400 1727204517.89347: stdout chunk (state=3): >>><<< 46400 1727204517.89577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": 
"systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204517.90107: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204517.90117: _low_level_execute_command(): starting 46400 1727204517.90120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204516.2243679-46921-268799533744762/ > /dev/null 2>&1 && sleep 0' 46400 1727204517.90569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204517.90593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204517.90606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204517.90616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204517.90657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204517.90674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204517.90721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204517.92544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204517.92590: stderr chunk (state=3): >>><<< 46400 1727204517.92594: stdout chunk (state=3): >>><<< 46400 1727204517.92605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204517.92612: handler run complete 46400 1727204517.92715: variable 'ansible_facts' from source: unknown 46400 1727204517.92799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204517.93081: variable 'ansible_facts' from source: unknown 46400 1727204517.93150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204517.93256: attempt loop complete, returning result 46400 1727204517.93259: _execute() done 46400 1727204517.93268: dumping result to json 46400 1727204517.93301: done dumping result, returning 46400 1727204517.93309: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-00000000026e] 46400 1727204517.93314: sending task result for task 0affcd87-79f5-1303-fda8-00000000026e ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204517.93833: no more pending results, returning what we have 46400 1727204517.93836: results queue empty 46400 1727204517.93837: checking for any_errors_fatal 46400 1727204517.93841: done checking for any_errors_fatal 46400 1727204517.93842: checking for max_fail_percentage 46400 1727204517.93843: done checking for max_fail_percentage 46400 1727204517.93844: checking to see if all hosts have failed and the running result is not ok 46400 1727204517.93845: done checking to see if all hosts have failed 46400 1727204517.93845: getting the remaining hosts for this loop 46400 1727204517.93847: done getting the remaining hosts for this loop 46400 1727204517.93850: getting the next task for host managed-node2 46400 1727204517.93856: done getting next task for host managed-node2 46400 1727204517.93860: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204517.93866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204517.93877: getting variables 46400 1727204517.93879: in VariableManager get_vars() 46400 1727204517.93905: Calling all_inventory to load vars for managed-node2 46400 1727204517.93907: Calling groups_inventory to load vars for managed-node2 46400 1727204517.93909: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204517.93917: Calling all_plugins_play to load vars for managed-node2 46400 1727204517.93918: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204517.93921: Calling groups_plugins_play to load vars for managed-node2 46400 1727204517.94137: done sending task result for task 0affcd87-79f5-1303-fda8-00000000026e 46400 1727204517.94140: WORKER PROCESS EXITING 46400 1727204517.94149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204517.94417: done with get_vars() 46400 1727204517.94427: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:57 -0400 (0:00:01.807) 0:00:08.229 ***** 46400 1727204517.94499: entering _queue_task() for managed-node2/package_facts 46400 1727204517.94500: Creating lock for package_facts 46400 1727204517.94746: worker is 1 (out of 1 available) 46400 1727204517.94768: exiting _queue_task() for managed-node2/package_facts 46400 1727204517.94781: done queuing things up, now waiting for results queue to drain 46400 1727204517.94783: waiting for pending results... 
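Editor's note: the task just queued, "Check which packages are installed" (roles/network/tasks/set_facts.yml:26), runs the package_facts module; its result, visible further below, is an ansible_facts.packages dictionary keyed by package name, with a list of {name, version, release, epoch, arch, source} records per key. A hypothetical stand-alone sketch of that step and of reading its result (again, not the role's actual task file) could look like this:

    # Illustrative only; the package looked up is an assumption, and
    # manager: auto is simply the module's default setting spelled out.
    - hosts: all
      gather_facts: false
      tasks:
        - name: Check which packages are installed
          ansible.builtin.package_facts:
            manager: auto

        - name: Show the installed NetworkManager version, if any
          ansible.builtin.debug:
            msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
          when: "'NetworkManager' in ansible_facts.packages"

The remainder of this section shows the usual execution plumbing for that module: creating a remote temp directory, transferring AnsiballZ_package_facts.py over the multiplexed SSH connection, marking it executable, and running it with the remote Python interpreter.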
46400 1727204517.95067: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204517.95237: in run() - task 0affcd87-79f5-1303-fda8-00000000026f 46400 1727204517.95257: variable 'ansible_search_path' from source: unknown 46400 1727204517.95270: variable 'ansible_search_path' from source: unknown 46400 1727204517.95314: calling self._execute() 46400 1727204517.95405: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204517.95426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204517.95446: variable 'omit' from source: magic vars 46400 1727204517.95863: variable 'ansible_distribution_major_version' from source: facts 46400 1727204517.95874: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204517.95878: variable 'omit' from source: magic vars 46400 1727204517.95952: variable 'omit' from source: magic vars 46400 1727204517.95984: variable 'omit' from source: magic vars 46400 1727204517.96017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204517.96049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204517.96070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204517.96087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204517.96097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204517.96119: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204517.96122: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204517.96124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204517.96191: Set connection var ansible_shell_type to sh 46400 1727204517.96200: Set connection var ansible_shell_executable to /bin/sh 46400 1727204517.96206: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204517.96210: Set connection var ansible_connection to ssh 46400 1727204517.96215: Set connection var ansible_pipelining to False 46400 1727204517.96220: Set connection var ansible_timeout to 10 46400 1727204517.96238: variable 'ansible_shell_executable' from source: unknown 46400 1727204517.96241: variable 'ansible_connection' from source: unknown 46400 1727204517.96244: variable 'ansible_module_compression' from source: unknown 46400 1727204517.96246: variable 'ansible_shell_type' from source: unknown 46400 1727204517.96248: variable 'ansible_shell_executable' from source: unknown 46400 1727204517.96250: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204517.96252: variable 'ansible_pipelining' from source: unknown 46400 1727204517.96255: variable 'ansible_timeout' from source: unknown 46400 1727204517.96259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204517.96409: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204517.96413: variable 'omit' from source: magic vars 46400 
1727204517.96416: starting attempt loop 46400 1727204517.96418: running the handler 46400 1727204517.96432: _low_level_execute_command(): starting 46400 1727204517.96437: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204517.96936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204517.96945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204517.96979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204517.96987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204517.96998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204517.97008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204517.97050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204517.97068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204517.97119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204517.98784: stdout chunk (state=3): >>>/root <<< 46400 1727204517.98938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204517.98973: stderr chunk (state=3): >>><<< 46400 1727204517.98982: stdout chunk (state=3): >>><<< 46400 1727204517.99007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204517.99026: _low_level_execute_command(): starting 46400 1727204517.99042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660 `" && echo ansible-tmp-1727204517.9901361-47059-88209660813660="` echo /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660 `" ) && sleep 0' 46400 1727204517.99705: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204517.99718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204517.99731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204517.99747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204517.99793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204517.99817: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204517.99830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204517.99847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204517.99858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204517.99870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204517.99882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204517.99893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204517.99908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204517.99925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204517.99935: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204517.99947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.00029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204518.00052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204518.00068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204518.00143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204518.02067: stdout chunk (state=3): >>>ansible-tmp-1727204517.9901361-47059-88209660813660=/root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660 <<< 46400 1727204518.02268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204518.02272: stdout chunk (state=3): >>><<< 46400 1727204518.02274: stderr chunk (state=3): >>><<< 46400 1727204518.02770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204517.9901361-47059-88209660813660=/root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204518.02774: variable 'ansible_module_compression' from source: unknown 46400 1727204518.02776: ANSIBALLZ: Using lock for package_facts 46400 1727204518.02778: ANSIBALLZ: Acquiring lock 46400 1727204518.02780: ANSIBALLZ: Lock acquired: 140519370663808 46400 1727204518.02782: ANSIBALLZ: Creating module 46400 1727204518.44379: ANSIBALLZ: Writing module into payload 46400 1727204518.44553: ANSIBALLZ: Writing module 46400 1727204518.44596: ANSIBALLZ: Renaming module 46400 1727204518.44608: ANSIBALLZ: Done creating module 46400 1727204518.44632: variable 'ansible_facts' from source: unknown 46400 1727204518.44804: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/AnsiballZ_package_facts.py 46400 1727204518.44961: Sending initial data 46400 1727204518.44966: Sent initial data (161 bytes) 46400 1727204518.46010: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204518.46013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204518.46050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.46053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204518.46055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.46127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204518.46139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204518.46206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204518.48034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204518.48074: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204518.48110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpfzbnu_uu /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/AnsiballZ_package_facts.py <<< 46400 1727204518.48135: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204518.50689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204518.50870: stderr chunk (state=3): >>><<< 46400 1727204518.50966: stdout chunk (state=3): >>><<< 46400 1727204518.50970: done transferring module to remote 46400 1727204518.50973: _low_level_execute_command(): starting 46400 1727204518.50975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/ /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/AnsiballZ_package_facts.py && sleep 0' 46400 1727204518.51574: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204518.51589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204518.51604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204518.51621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204518.51661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204518.51678: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204518.51697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.51715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204518.51727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204518.51739: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204518.51752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204518.51769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204518.51786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204518.51798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204518.51809: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204518.51823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.51897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204518.51912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204518.51926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204518.51998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204518.53790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204518.53848: stderr chunk (state=3): >>><<< 46400 1727204518.53850: stdout 
chunk (state=3): >>><<< 46400 1727204518.53868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204518.53871: _low_level_execute_command(): starting 46400 1727204518.53876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/AnsiballZ_package_facts.py && sleep 0' 46400 1727204518.54315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204518.54321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204518.54372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.54375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204518.54377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204518.54385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204518.54432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204518.54436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204518.54493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204519.01236: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204519.01348: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": 
"libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": 
"libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204519.01390: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": 
[{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": 
"ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204519.01432: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": 
"9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204519.01453: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204519.01457: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204519.02986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204519.03146: stderr chunk (state=3): >>><<< 46400 1727204519.03149: stdout chunk (state=3): >>><<< 46400 1727204519.03277: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204519.07111: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204519.07138: _low_level_execute_command(): starting 46400 1727204519.07147: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204517.9901361-47059-88209660813660/ > /dev/null 2>&1 && sleep 0' 46400 1727204519.07790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204519.07805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204519.07818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204519.07839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204519.07886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204519.07898: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204519.07910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204519.07925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204519.07936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204519.07949: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204519.07960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204519.07977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204519.07993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204519.08005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204519.08016: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204519.08029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204519.08105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204519.08128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204519.08144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204519.08215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204519.10083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204519.10117: stderr chunk (state=3): >>><<< 46400 1727204519.10120: stdout chunk (state=3): >>><<< 46400 1727204519.10272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204519.10275: handler run complete 46400 1727204519.11118: variable 'ansible_facts' from source: unknown 46400 1727204519.11597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.13787: variable 'ansible_facts' from source: unknown 46400 1727204519.14368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.15318: attempt loop complete, returning result 46400 1727204519.15342: _execute() done 46400 1727204519.15349: dumping result to json 46400 1727204519.15567: done dumping result, returning 46400 1727204519.15581: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-00000000026f] 46400 1727204519.15590: sending task result for task 0affcd87-79f5-1303-fda8-00000000026f ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204519.18194: no more pending results, returning what we have 46400 1727204519.18197: results queue empty 46400 1727204519.18198: checking for any_errors_fatal 46400 1727204519.18202: done checking for any_errors_fatal 46400 1727204519.18203: checking for max_fail_percentage 46400 1727204519.18204: done checking for max_fail_percentage 46400 1727204519.18205: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.18206: done checking to see if all hosts have failed 46400 1727204519.18206: getting the remaining hosts for this loop 46400 1727204519.18208: done getting the remaining hosts for this loop 46400 1727204519.18211: getting the next task for host managed-node2 46400 1727204519.18219: done getting next task for host managed-node2 46400 1727204519.18222: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204519.18228: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.18236: getting variables 46400 1727204519.18237: in VariableManager get_vars() 46400 1727204519.18263: Calling all_inventory to load vars for managed-node2 46400 1727204519.18267: Calling groups_inventory to load vars for managed-node2 46400 1727204519.18270: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.18278: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.18286: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.18289: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.19282: done sending task result for task 0affcd87-79f5-1303-fda8-00000000026f 46400 1727204519.19287: WORKER PROCESS EXITING 46400 1727204519.19516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.21334: done with get_vars() 46400 1727204519.21361: done getting variables 46400 1727204519.21425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:59 -0400 (0:00:01.269) 0:00:09.499 ***** 46400 1727204519.21460: entering _queue_task() for managed-node2/debug 46400 1727204519.21769: worker is 1 (out of 1 available) 46400 1727204519.21782: exiting _queue_task() for managed-node2/debug 46400 1727204519.21793: done queuing things up, now waiting for results queue to drain 46400 1727204519.21794: waiting for pending results... 
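The stderr chunks in this run repeatedly show "auto-mux: Trying existing master" and mux_client_request_session: each remote command reuses one multiplexed OpenSSH connection instead of opening a fresh one. Ansible's ssh connection plugin passes ControlMaster/ControlPersist options by default; the snippet below is only an illustrative way to set the same behaviour explicitly for a host or group, and the ControlPath value is an assumption, not taken from this run:

    # group_vars sketch: extra options appended to every ssh invocation.
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%r@%h:%p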
46400 1727204519.22059: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204519.22193: in run() - task 0affcd87-79f5-1303-fda8-00000000020d 46400 1727204519.22212: variable 'ansible_search_path' from source: unknown 46400 1727204519.22219: variable 'ansible_search_path' from source: unknown 46400 1727204519.22266: calling self._execute() 46400 1727204519.22358: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.22374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.22390: variable 'omit' from source: magic vars 46400 1727204519.22754: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.22773: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.22787: variable 'omit' from source: magic vars 46400 1727204519.22852: variable 'omit' from source: magic vars 46400 1727204519.22952: variable 'network_provider' from source: set_fact 46400 1727204519.22978: variable 'omit' from source: magic vars 46400 1727204519.23029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204519.23067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204519.23094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204519.23119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204519.23134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204519.23165: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204519.23175: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.23182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.23281: Set connection var ansible_shell_type to sh 46400 1727204519.23295: Set connection var ansible_shell_executable to /bin/sh 46400 1727204519.23304: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204519.23312: Set connection var ansible_connection to ssh 46400 1727204519.23325: Set connection var ansible_pipelining to False 46400 1727204519.23335: Set connection var ansible_timeout to 10 46400 1727204519.23361: variable 'ansible_shell_executable' from source: unknown 46400 1727204519.23372: variable 'ansible_connection' from source: unknown 46400 1727204519.23379: variable 'ansible_module_compression' from source: unknown 46400 1727204519.23386: variable 'ansible_shell_type' from source: unknown 46400 1727204519.23391: variable 'ansible_shell_executable' from source: unknown 46400 1727204519.23397: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.23404: variable 'ansible_pipelining' from source: unknown 46400 1727204519.23410: variable 'ansible_timeout' from source: unknown 46400 1727204519.23417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.23556: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204519.23573: variable 'omit' from source: magic vars 46400 1727204519.23582: starting attempt loop 46400 1727204519.23588: running the handler 46400 1727204519.23630: handler run complete 46400 1727204519.23653: attempt loop complete, returning result 46400 1727204519.23659: _execute() done 46400 1727204519.23668: dumping result to json 46400 1727204519.23675: done dumping result, returning 46400 1727204519.23685: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-00000000020d] 46400 1727204519.23694: sending task result for task 0affcd87-79f5-1303-fda8-00000000020d ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204519.23833: no more pending results, returning what we have 46400 1727204519.23837: results queue empty 46400 1727204519.23838: checking for any_errors_fatal 46400 1727204519.23848: done checking for any_errors_fatal 46400 1727204519.23848: checking for max_fail_percentage 46400 1727204519.23850: done checking for max_fail_percentage 46400 1727204519.23851: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.23852: done checking to see if all hosts have failed 46400 1727204519.23853: getting the remaining hosts for this loop 46400 1727204519.23854: done getting the remaining hosts for this loop 46400 1727204519.23858: getting the next task for host managed-node2 46400 1727204519.23870: done getting next task for host managed-node2 46400 1727204519.23874: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204519.23879: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204519.23891: getting variables 46400 1727204519.23892: in VariableManager get_vars() 46400 1727204519.23928: Calling all_inventory to load vars for managed-node2 46400 1727204519.23931: Calling groups_inventory to load vars for managed-node2 46400 1727204519.23933: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.23943: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.23946: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.23949: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.24982: done sending task result for task 0affcd87-79f5-1303-fda8-00000000020d 46400 1727204519.24986: WORKER PROCESS EXITING 46400 1727204519.25674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.27342: done with get_vars() 46400 1727204519.27367: done getting variables 46400 1727204519.27453: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.060) 0:00:09.559 ***** 46400 1727204519.27494: entering _queue_task() for managed-node2/fail 46400 1727204519.27496: Creating lock for fail 46400 1727204519.27767: worker is 1 (out of 1 available) 46400 1727204519.27779: exiting _queue_task() for managed-node2/fail 46400 1727204519.27790: done queuing things up, now waiting for results queue to drain 46400 1727204519.27792: waiting for pending results... 
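The "Print network provider" task (roles/network/tasks/main.yml:7) is a plain debug action: it reads network_provider, which an earlier set_fact established, and prints "Using network provider: nm". A sketch of what such a task typically looks like, hedged because the role's source file is not reproduced in this log:

    # Report which backend the role selected (NetworkManager, 'nm', here).
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"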
46400 1727204519.28048: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204519.28198: in run() - task 0affcd87-79f5-1303-fda8-00000000020e 46400 1727204519.28217: variable 'ansible_search_path' from source: unknown 46400 1727204519.28226: variable 'ansible_search_path' from source: unknown 46400 1727204519.28268: calling self._execute() 46400 1727204519.28352: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.28363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.28381: variable 'omit' from source: magic vars 46400 1727204519.28744: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.28761: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.28868: variable 'network_state' from source: role '' defaults 46400 1727204519.28966: Evaluated conditional (network_state != {}): False 46400 1727204519.28977: when evaluation is False, skipping this task 46400 1727204519.28988: _execute() done 46400 1727204519.28996: dumping result to json 46400 1727204519.29003: done dumping result, returning 46400 1727204519.29014: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-00000000020e] 46400 1727204519.29025: sending task result for task 0affcd87-79f5-1303-fda8-00000000020e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204519.29173: no more pending results, returning what we have 46400 1727204519.29177: results queue empty 46400 1727204519.29178: checking for any_errors_fatal 46400 1727204519.29185: done checking for any_errors_fatal 46400 1727204519.29186: checking for max_fail_percentage 46400 1727204519.29188: done checking for max_fail_percentage 46400 1727204519.29189: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.29189: done checking to see if all hosts have failed 46400 1727204519.29190: getting the remaining hosts for this loop 46400 1727204519.29192: done getting the remaining hosts for this loop 46400 1727204519.29197: getting the next task for host managed-node2 46400 1727204519.29205: done getting next task for host managed-node2 46400 1727204519.29209: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204519.29214: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.29230: getting variables 46400 1727204519.29232: in VariableManager get_vars() 46400 1727204519.29271: Calling all_inventory to load vars for managed-node2 46400 1727204519.29275: Calling groups_inventory to load vars for managed-node2 46400 1727204519.29277: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.29291: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.29294: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.29297: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.30722: done sending task result for task 0affcd87-79f5-1303-fda8-00000000020e 46400 1727204519.30726: WORKER PROCESS EXITING 46400 1727204519.31205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.33642: done with get_vars() 46400 1727204519.33671: done getting variables 46400 1727204519.33730: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.062) 0:00:09.622 ***** 46400 1727204519.33763: entering _queue_task() for managed-node2/fail 46400 1727204519.34274: worker is 1 (out of 1 available) 46400 1727204519.34287: exiting _queue_task() for managed-node2/fail 46400 1727204519.34300: done queuing things up, now waiting for results queue to drain 46400 1727204519.34302: waiting for pending results... 
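The "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" task is a fail action that was skipped because its guard, network_state != {}, evaluated to False: network_state comes from the role defaults, where it is an empty dict, and nothing in this run overrides it. The next task queued above applies the same guard and is skipped the same way. A hedged sketch of the pattern; the failure message is illustrative, and the real role task presumably also checks the provider, as its name implies:

    # Guarded abort: only fires when the caller actually supplied a
    # network_state definition; with the role default ({}) it is skipped.
    - name: Abort if network_state is used with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider  # illustrative wording
      when: network_state != {}  # the condition that evaluated to False here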
46400 1727204519.34612: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204519.34744: in run() - task 0affcd87-79f5-1303-fda8-00000000020f 46400 1727204519.34763: variable 'ansible_search_path' from source: unknown 46400 1727204519.34775: variable 'ansible_search_path' from source: unknown 46400 1727204519.34814: calling self._execute() 46400 1727204519.34909: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.34920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.34932: variable 'omit' from source: magic vars 46400 1727204519.35292: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.35307: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.35434: variable 'network_state' from source: role '' defaults 46400 1727204519.35449: Evaluated conditional (network_state != {}): False 46400 1727204519.35457: when evaluation is False, skipping this task 46400 1727204519.35465: _execute() done 46400 1727204519.35472: dumping result to json 46400 1727204519.35478: done dumping result, returning 46400 1727204519.35491: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-00000000020f] 46400 1727204519.35501: sending task result for task 0affcd87-79f5-1303-fda8-00000000020f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204519.35639: no more pending results, returning what we have 46400 1727204519.35644: results queue empty 46400 1727204519.35645: checking for any_errors_fatal 46400 1727204519.35652: done checking for any_errors_fatal 46400 1727204519.35653: checking for max_fail_percentage 46400 1727204519.35655: done checking for max_fail_percentage 46400 1727204519.35656: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.35657: done checking to see if all hosts have failed 46400 1727204519.35658: getting the remaining hosts for this loop 46400 1727204519.35660: done getting the remaining hosts for this loop 46400 1727204519.35665: getting the next task for host managed-node2 46400 1727204519.35674: done getting next task for host managed-node2 46400 1727204519.35679: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204519.35684: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.35701: getting variables 46400 1727204519.35703: in VariableManager get_vars() 46400 1727204519.35739: Calling all_inventory to load vars for managed-node2 46400 1727204519.35742: Calling groups_inventory to load vars for managed-node2 46400 1727204519.35744: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.35756: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.35759: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.35762: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.36858: done sending task result for task 0affcd87-79f5-1303-fda8-00000000020f 46400 1727204519.36862: WORKER PROCESS EXITING 46400 1727204519.37486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.39151: done with get_vars() 46400 1727204519.39181: done getting variables 46400 1727204519.39242: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.055) 0:00:09.677 ***** 46400 1727204519.39279: entering _queue_task() for managed-node2/fail 46400 1727204519.39582: worker is 1 (out of 1 available) 46400 1727204519.39596: exiting _queue_task() for managed-node2/fail 46400 1727204519.39609: done queuing things up, now waiting for results queue to drain 46400 1727204519.39610: waiting for pending results... 
46400 1727204519.39881: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204519.40032: in run() - task 0affcd87-79f5-1303-fda8-000000000210 46400 1727204519.40056: variable 'ansible_search_path' from source: unknown 46400 1727204519.40066: variable 'ansible_search_path' from source: unknown 46400 1727204519.40105: calling self._execute() 46400 1727204519.40192: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.40202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.40214: variable 'omit' from source: magic vars 46400 1727204519.40572: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.40590: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.40745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204519.43398: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204519.43522: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204519.43570: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204519.43611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204519.43643: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204519.43730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.43773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.43806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.43856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.43881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.43983: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.44003: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204519.44010: when evaluation is False, skipping this task 46400 1727204519.44016: _execute() done 46400 1727204519.44022: dumping result to json 46400 1727204519.44028: done dumping result, returning 46400 1727204519.44039: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000000210] 46400 1727204519.44050: sending task result for task 
0affcd87-79f5-1303-fda8-000000000210 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204519.44194: no more pending results, returning what we have 46400 1727204519.44199: results queue empty 46400 1727204519.44200: checking for any_errors_fatal 46400 1727204519.44207: done checking for any_errors_fatal 46400 1727204519.44208: checking for max_fail_percentage 46400 1727204519.44210: done checking for max_fail_percentage 46400 1727204519.44211: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.44212: done checking to see if all hosts have failed 46400 1727204519.44213: getting the remaining hosts for this loop 46400 1727204519.44215: done getting the remaining hosts for this loop 46400 1727204519.44219: getting the next task for host managed-node2 46400 1727204519.44229: done getting next task for host managed-node2 46400 1727204519.44234: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204519.44239: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204519.44254: getting variables 46400 1727204519.44255: in VariableManager get_vars() 46400 1727204519.44298: Calling all_inventory to load vars for managed-node2 46400 1727204519.44301: Calling groups_inventory to load vars for managed-node2 46400 1727204519.44304: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.44315: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.44318: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.44322: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.45124: done sending task result for task 0affcd87-79f5-1303-fda8-000000000210 46400 1727204519.45128: WORKER PROCESS EXITING 46400 1727204519.45443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.46631: done with get_vars() 46400 1727204519.46658: done getting variables 46400 1727204519.46761: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.075) 0:00:09.752 ***** 46400 1727204519.46801: entering _queue_task() for managed-node2/dnf 46400 1727204519.47106: worker is 1 (out of 1 available) 46400 1727204519.47117: exiting _queue_task() for managed-node2/dnf 46400 1727204519.47130: done queuing things up, now waiting for results queue to drain 46400 1727204519.47131: waiting for pending results... 
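The skip just above ("Abort applying teaming configuration if the system version of the managed host is EL10 or later") is driven entirely by the reported false_condition ansible_distribution_major_version | int > 9: on this EL9-or-earlier node the guard never fires. A minimal sketch of what such a guard task could look like follows; the excerpt does not show which module the role actually uses (ansible.builtin.fail is an assumption), the message text is invented, and the real task in the role's tasks/main.yml very likely carries additional conditions (the log also shows the role-wide ansible_distribution_major_version != '6' check being evaluated first).

    # Hedged sketch, not the role's exact source: abort when teaming is requested
    # on EL10 or later. Only the when-expression is taken from the log above.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later.
      when: ansible_distribution_major_version | int > 9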
46400 1727204519.47566: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204519.47654: in run() - task 0affcd87-79f5-1303-fda8-000000000211 46400 1727204519.47675: variable 'ansible_search_path' from source: unknown 46400 1727204519.47681: variable 'ansible_search_path' from source: unknown 46400 1727204519.47710: calling self._execute() 46400 1727204519.47780: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.47784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.47795: variable 'omit' from source: magic vars 46400 1727204519.48062: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.48077: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.48211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204519.50084: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204519.50173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204519.50217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204519.50257: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204519.50297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204519.50382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.50421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.50455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.50515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.50536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.50670: variable 'ansible_distribution' from source: facts 46400 1727204519.50681: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.50707: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204519.50837: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204519.51003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.51047: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.51123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.51194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.51214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.51310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.51326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.51347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.51380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.51391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.51432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.51448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.51473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.51497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.51508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.51612: variable 'network_connections' from source: include params 46400 1727204519.51623: variable 'interface' from source: play vars 46400 1727204519.51675: variable 'interface' from source: play vars 46400 1727204519.51725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204519.51842: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204519.51873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204519.51896: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204519.51920: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204519.51951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204519.51970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204519.51991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.52010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204519.52054: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204519.52218: variable 'network_connections' from source: include params 46400 1727204519.52222: variable 'interface' from source: play vars 46400 1727204519.52270: variable 'interface' from source: play vars 46400 1727204519.52294: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204519.52297: when evaluation is False, skipping this task 46400 1727204519.52300: _execute() done 46400 1727204519.52302: dumping result to json 46400 1727204519.52304: done dumping result, returning 46400 1727204519.52311: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000211] 46400 1727204519.52316: sending task result for task 0affcd87-79f5-1303-fda8-000000000211 46400 1727204519.52404: done sending task result for task 0affcd87-79f5-1303-fda8-000000000211 46400 1727204519.52407: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204519.52452: no more pending results, returning what we have 46400 1727204519.52455: results queue empty 46400 1727204519.52457: checking for any_errors_fatal 46400 1727204519.52463: done checking for any_errors_fatal 46400 1727204519.52465: checking for max_fail_percentage 46400 1727204519.52467: done checking for max_fail_percentage 46400 1727204519.52468: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.52468: done checking to see if all hosts have failed 46400 1727204519.52469: getting the remaining hosts for this loop 46400 1727204519.52471: done getting the remaining hosts for this loop 46400 1727204519.52475: getting the next task for host managed-node2 46400 1727204519.52482: done getting next task for host managed-node2 46400 1727204519.52486: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204519.52491: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.52505: getting variables 46400 1727204519.52506: in VariableManager get_vars() 46400 1727204519.52542: Calling all_inventory to load vars for managed-node2 46400 1727204519.52544: Calling groups_inventory to load vars for managed-node2 46400 1727204519.52547: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.52557: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.52559: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.52562: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.53786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.55485: done with get_vars() 46400 1727204519.55508: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204519.55567: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.087) 0:00:09.840 ***** 46400 1727204519.55595: entering _queue_task() for managed-node2/yum 46400 1727204519.55596: Creating lock for yum 46400 1727204519.55824: worker is 1 (out of 1 available) 46400 1727204519.55838: exiting _queue_task() for managed-node2/yum 46400 1727204519.55851: done queuing things up, now waiting for results queue to drain 46400 1727204519.55852: waiting for pending results... 
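The DNF-based update check at tasks/main.yml:36 was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection profile; the YUM variant queued next (tasks/main.yml:48) is additionally gated on ansible_distribution_major_version | int < 8, and on this host ansible.builtin.yum is simply redirected to ansible.builtin.dnf. A rough sketch of how such a check-mode package probe can be written is below; only the two when-expressions are taken from the log, while the module arguments, check_mode usage, and the shape of network_packages are assumptions rather than the role's exact source.

    # Hedged sketch of an "are updates available?" probe. check_mode keeps dnf
    # from changing anything; a changed result then signals pending updates.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      register: __probe_updates   # hypothetical register name
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined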
46400 1727204519.56024: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204519.56119: in run() - task 0affcd87-79f5-1303-fda8-000000000212 46400 1727204519.56129: variable 'ansible_search_path' from source: unknown 46400 1727204519.56133: variable 'ansible_search_path' from source: unknown 46400 1727204519.56167: calling self._execute() 46400 1727204519.56227: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.56231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.56239: variable 'omit' from source: magic vars 46400 1727204519.56512: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.56521: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.56640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204519.58750: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204519.58806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204519.58833: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204519.58859: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204519.58880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204519.58942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.58965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.58982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.59013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.59025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.59092: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.59103: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204519.59110: when evaluation is False, skipping this task 46400 1727204519.59117: _execute() done 46400 1727204519.59122: dumping result to json 46400 1727204519.59124: done dumping result, returning 46400 1727204519.59131: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000212] 46400 
1727204519.59137: sending task result for task 0affcd87-79f5-1303-fda8-000000000212 46400 1727204519.59224: done sending task result for task 0affcd87-79f5-1303-fda8-000000000212 46400 1727204519.59227: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204519.59277: no more pending results, returning what we have 46400 1727204519.59280: results queue empty 46400 1727204519.59282: checking for any_errors_fatal 46400 1727204519.59287: done checking for any_errors_fatal 46400 1727204519.59288: checking for max_fail_percentage 46400 1727204519.59290: done checking for max_fail_percentage 46400 1727204519.59291: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.59292: done checking to see if all hosts have failed 46400 1727204519.59292: getting the remaining hosts for this loop 46400 1727204519.59294: done getting the remaining hosts for this loop 46400 1727204519.59297: getting the next task for host managed-node2 46400 1727204519.59306: done getting next task for host managed-node2 46400 1727204519.59310: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204519.59315: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204519.59328: getting variables 46400 1727204519.59330: in VariableManager get_vars() 46400 1727204519.59366: Calling all_inventory to load vars for managed-node2 46400 1727204519.59369: Calling groups_inventory to load vars for managed-node2 46400 1727204519.59372: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.59382: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.59384: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.59387: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.60288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.61190: done with get_vars() 46400 1727204519.61206: done getting variables 46400 1727204519.61248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.056) 0:00:09.897 ***** 46400 1727204519.61275: entering _queue_task() for managed-node2/fail 46400 1727204519.61482: worker is 1 (out of 1 available) 46400 1727204519.61495: exiting _queue_task() for managed-node2/fail 46400 1727204519.61507: done queuing things up, now waiting for results queue to drain 46400 1727204519.61509: waiting for pending results... 
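The "Ask user's consent to restart NetworkManager" task at tasks/main.yml:60 is executed through the fail action (the log loads ActionModule 'fail' for it just before the task banner) and, as the result below shows, it is skipped for the same reason as the update checks: no wireless or team connections are defined, so there is no NetworkManager restart to confirm. A hedged sketch, where the message text and any extra guard variables (for example an explicit operator opt-in flag the real role may consult) are assumptions:

    # Hedged sketch: hard-stop unless the operator has acknowledged that applying
    # wireless/team profiles may restart NetworkManager. Only the
    # wireless-or-team condition is taken from the log.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Applying this configuration may restart NetworkManager; confirm before proceeding.
      when: __network_wireless_connections_defined or __network_team_connections_defined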
46400 1727204519.61683: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204519.61772: in run() - task 0affcd87-79f5-1303-fda8-000000000213 46400 1727204519.61783: variable 'ansible_search_path' from source: unknown 46400 1727204519.61787: variable 'ansible_search_path' from source: unknown 46400 1727204519.61814: calling self._execute() 46400 1727204519.61882: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.61886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.61895: variable 'omit' from source: magic vars 46400 1727204519.62161: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.62174: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.62253: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204519.62393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204519.64029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204519.64084: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204519.64111: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204519.64139: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204519.64162: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204519.64217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.64239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.64259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.64294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.64305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.64339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.64358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.64382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.64407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.64417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.64447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.64468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.64488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.64889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.64892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.64895: variable 'network_connections' from source: include params 46400 1727204519.64897: variable 'interface' from source: play vars 46400 1727204519.64899: variable 'interface' from source: play vars 46400 1727204519.64901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204519.65017: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204519.65052: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204519.65086: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204519.65113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204519.65153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204519.65179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204519.65203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.65228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204519.65294: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204519.65537: variable 'network_connections' 
from source: include params 46400 1727204519.65541: variable 'interface' from source: play vars 46400 1727204519.65608: variable 'interface' from source: play vars 46400 1727204519.65640: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204519.65643: when evaluation is False, skipping this task 46400 1727204519.65646: _execute() done 46400 1727204519.65649: dumping result to json 46400 1727204519.65651: done dumping result, returning 46400 1727204519.65659: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000213] 46400 1727204519.65669: sending task result for task 0affcd87-79f5-1303-fda8-000000000213 46400 1727204519.65759: done sending task result for task 0affcd87-79f5-1303-fda8-000000000213 46400 1727204519.65762: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204519.65831: no more pending results, returning what we have 46400 1727204519.65835: results queue empty 46400 1727204519.65836: checking for any_errors_fatal 46400 1727204519.65843: done checking for any_errors_fatal 46400 1727204519.65844: checking for max_fail_percentage 46400 1727204519.65846: done checking for max_fail_percentage 46400 1727204519.65847: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.65847: done checking to see if all hosts have failed 46400 1727204519.65848: getting the remaining hosts for this loop 46400 1727204519.65850: done getting the remaining hosts for this loop 46400 1727204519.65853: getting the next task for host managed-node2 46400 1727204519.65861: done getting next task for host managed-node2 46400 1727204519.65867: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204519.65871: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204519.65887: getting variables 46400 1727204519.65889: in VariableManager get_vars() 46400 1727204519.65922: Calling all_inventory to load vars for managed-node2 46400 1727204519.65925: Calling groups_inventory to load vars for managed-node2 46400 1727204519.65927: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.65936: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.65938: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.65940: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.67073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.67976: done with get_vars() 46400 1727204519.67997: done getting variables 46400 1727204519.68044: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.067) 0:00:09.965 ***** 46400 1727204519.68071: entering _queue_task() for managed-node2/package 46400 1727204519.68299: worker is 1 (out of 1 available) 46400 1727204519.68315: exiting _queue_task() for managed-node2/package 46400 1727204519.68329: done queuing things up, now waiting for results queue to drain 46400 1727204519.68330: waiting for pending results... 46400 1727204519.68527: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204519.68679: in run() - task 0affcd87-79f5-1303-fda8-000000000214 46400 1727204519.68700: variable 'ansible_search_path' from source: unknown 46400 1727204519.68707: variable 'ansible_search_path' from source: unknown 46400 1727204519.68744: calling self._execute() 46400 1727204519.68835: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.68848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.68860: variable 'omit' from source: magic vars 46400 1727204519.69233: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.69251: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.69452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204519.69717: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204519.69765: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204519.69803: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204519.69840: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204519.69953: variable 'network_packages' from source: role '' defaults 46400 1727204519.70068: variable '__network_provider_setup' from source: role '' defaults 46400 1727204519.70083: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204519.70168: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204519.70181: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204519.70249: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204519.70440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204519.73014: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204519.73093: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204519.73143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204519.73182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204519.73213: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204519.73300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.73332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.73368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.73414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.73434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.73487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.73513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.73541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.73589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.73607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.73849: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204519.73971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.74001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.74034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.74109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.74131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.74244: variable 'ansible_python' from source: facts 46400 1727204519.74291: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204519.74408: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204519.74509: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204519.74676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.74709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.74738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.74788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.74805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.74854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204519.74899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204519.74926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.74973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204519.74996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204519.75146: variable 'network_connections' from source: include params 46400 1727204519.75157: variable 'interface' from source: play vars 46400 1727204519.75274: variable 'interface' from source: play vars 46400 1727204519.75353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204519.75389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204519.75427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204519.75465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204519.75516: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204519.75822: variable 'network_connections' from source: include params 46400 1727204519.75831: variable 'interface' from source: play vars 46400 1727204519.75942: variable 'interface' from source: play vars 46400 1727204519.76010: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204519.76099: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204519.76449: variable 'network_connections' from source: include params 46400 1727204519.76475: variable 'interface' from source: play vars 46400 1727204519.76650: variable 'interface' from source: play vars 46400 1727204519.76682: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204519.76773: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204519.77106: variable 'network_connections' from source: include params 46400 1727204519.77115: variable 'interface' from source: play vars 46400 1727204519.77186: variable 'interface' from source: play vars 46400 1727204519.77244: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204519.77308: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204519.77318: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204519.77378: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204519.77597: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204519.78107: variable 'network_connections' from source: include params 46400 1727204519.78117: variable 'interface' from source: play vars 46400 1727204519.78186: variable 'interface' from source: play vars 46400 1727204519.78200: variable 'ansible_distribution' from source: facts 46400 1727204519.78207: variable '__network_rh_distros' from source: role '' defaults 46400 1727204519.78216: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.78247: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204519.78428: variable 'ansible_distribution' from source: 
facts 46400 1727204519.78437: variable '__network_rh_distros' from source: role '' defaults 46400 1727204519.78447: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.78462: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204519.78640: variable 'ansible_distribution' from source: facts 46400 1727204519.78648: variable '__network_rh_distros' from source: role '' defaults 46400 1727204519.78657: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.78702: variable 'network_provider' from source: set_fact 46400 1727204519.78725: variable 'ansible_facts' from source: unknown 46400 1727204519.79445: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204519.79456: when evaluation is False, skipping this task 46400 1727204519.79468: _execute() done 46400 1727204519.79477: dumping result to json 46400 1727204519.79484: done dumping result, returning 46400 1727204519.79494: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000000214] 46400 1727204519.79504: sending task result for task 0affcd87-79f5-1303-fda8-000000000214 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204519.79655: no more pending results, returning what we have 46400 1727204519.79662: results queue empty 46400 1727204519.79665: checking for any_errors_fatal 46400 1727204519.79671: done checking for any_errors_fatal 46400 1727204519.79672: checking for max_fail_percentage 46400 1727204519.79674: done checking for max_fail_percentage 46400 1727204519.79675: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.79676: done checking to see if all hosts have failed 46400 1727204519.79676: getting the remaining hosts for this loop 46400 1727204519.79678: done getting the remaining hosts for this loop 46400 1727204519.79682: getting the next task for host managed-node2 46400 1727204519.79691: done getting next task for host managed-node2 46400 1727204519.79695: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204519.79701: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 46400 1727204519.79715: getting variables 46400 1727204519.79717: in VariableManager get_vars() 46400 1727204519.79756: Calling all_inventory to load vars for managed-node2 46400 1727204519.79762: Calling groups_inventory to load vars for managed-node2 46400 1727204519.79772: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.79783: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.79786: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.79789: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.80783: done sending task result for task 0affcd87-79f5-1303-fda8-000000000214 46400 1727204519.80786: WORKER PROCESS EXITING 46400 1727204519.81614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.83308: done with get_vars() 46400 1727204519.83337: done getting variables 46400 1727204519.83404: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.153) 0:00:10.118 ***** 46400 1727204519.83440: entering _queue_task() for managed-node2/package 46400 1727204519.83755: worker is 1 (out of 1 available) 46400 1727204519.83771: exiting _queue_task() for managed-node2/package 46400 1727204519.83784: done queuing things up, now waiting for results queue to drain 46400 1727204519.83786: waiting for pending results... 
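The "Install packages" skip above hinges on the Jinja2 subset test: not network_packages is subset(ansible_facts.packages.keys()) evaluates to False, meaning every package the role wants is already present in the gathered package facts, so there is nothing to install. The log confirms the task goes through the package action plugin; the sketch below keeps only the condition reported in the log, while state handling and anything else about the real task is assumed. Note that this pattern presupposes that package facts (ansible_facts.packages) were collected earlier in the play, e.g. via ansible.builtin.package_facts.

    # Hedged sketch: install the role's package list only when at least one
    # entry is missing from the gathered package facts.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())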
46400 1727204519.84056: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204519.84217: in run() - task 0affcd87-79f5-1303-fda8-000000000215 46400 1727204519.84241: variable 'ansible_search_path' from source: unknown 46400 1727204519.84248: variable 'ansible_search_path' from source: unknown 46400 1727204519.84294: calling self._execute() 46400 1727204519.84393: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.84404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.84417: variable 'omit' from source: magic vars 46400 1727204519.84902: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.84919: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.85173: variable 'network_state' from source: role '' defaults 46400 1727204519.85218: Evaluated conditional (network_state != {}): False 46400 1727204519.85289: when evaluation is False, skipping this task 46400 1727204519.85296: _execute() done 46400 1727204519.85303: dumping result to json 46400 1727204519.85311: done dumping result, returning 46400 1727204519.85323: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000215] 46400 1727204519.85334: sending task result for task 0affcd87-79f5-1303-fda8-000000000215 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204519.85501: no more pending results, returning what we have 46400 1727204519.85507: results queue empty 46400 1727204519.85508: checking for any_errors_fatal 46400 1727204519.85517: done checking for any_errors_fatal 46400 1727204519.85518: checking for max_fail_percentage 46400 1727204519.85520: done checking for max_fail_percentage 46400 1727204519.85521: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.85522: done checking to see if all hosts have failed 46400 1727204519.85523: getting the remaining hosts for this loop 46400 1727204519.85525: done getting the remaining hosts for this loop 46400 1727204519.85529: getting the next task for host managed-node2 46400 1727204519.85539: done getting next task for host managed-node2 46400 1727204519.85544: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204519.85550: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.85571: getting variables 46400 1727204519.85573: in VariableManager get_vars() 46400 1727204519.85614: Calling all_inventory to load vars for managed-node2 46400 1727204519.85617: Calling groups_inventory to load vars for managed-node2 46400 1727204519.85619: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.85633: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.85636: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.85640: Calling groups_plugins_play to load vars for managed-node2 46400 1727204519.87551: done sending task result for task 0affcd87-79f5-1303-fda8-000000000215 46400 1727204519.87559: WORKER PROCESS EXITING 46400 1727204519.88178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204519.92198: done with get_vars() 46400 1727204519.92233: done getting variables 46400 1727204519.92299: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.088) 0:00:10.207 ***** 46400 1727204519.92335: entering _queue_task() for managed-node2/package 46400 1727204519.92640: worker is 1 (out of 1 available) 46400 1727204519.92655: exiting _queue_task() for managed-node2/package 46400 1727204519.92672: done queuing things up, now waiting for results queue to drain 46400 1727204519.92674: waiting for pending results... 
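Both package tasks above are skipped the same way: the role default for network_state is an empty dict, so the when: expression "network_state != {}" evaluates False. A minimal sketch of that evaluation using plain Jinja2 (not Ansible's Templar; the non-empty example value is hypothetical):

# How a `when:` expression like "network_state != {}" comes out False here.
from jinja2 import Environment

env = Environment()
when = env.compile_expression("network_state != {}")

task_vars = {"network_state": {}}                  # role '' defaults, as in the trace
print(when(**task_vars))                           # False -> "skipping this task"

task_vars = {"network_state": {"interfaces": []}}  # hypothetical non-empty state
print(when(**task_vars))                           # True -> the package task would run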
46400 1727204519.93281: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204519.93422: in run() - task 0affcd87-79f5-1303-fda8-000000000216 46400 1727204519.93573: variable 'ansible_search_path' from source: unknown 46400 1727204519.93581: variable 'ansible_search_path' from source: unknown 46400 1727204519.93619: calling self._execute() 46400 1727204519.93748: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204519.93890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204519.93903: variable 'omit' from source: magic vars 46400 1727204519.94595: variable 'ansible_distribution_major_version' from source: facts 46400 1727204519.94651: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204519.94895: variable 'network_state' from source: role '' defaults 46400 1727204519.94909: Evaluated conditional (network_state != {}): False 46400 1727204519.94973: when evaluation is False, skipping this task 46400 1727204519.94981: _execute() done 46400 1727204519.94989: dumping result to json 46400 1727204519.94996: done dumping result, returning 46400 1727204519.95007: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000216] 46400 1727204519.95018: sending task result for task 0affcd87-79f5-1303-fda8-000000000216 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204519.95180: no more pending results, returning what we have 46400 1727204519.95184: results queue empty 46400 1727204519.95185: checking for any_errors_fatal 46400 1727204519.95192: done checking for any_errors_fatal 46400 1727204519.95193: checking for max_fail_percentage 46400 1727204519.95195: done checking for max_fail_percentage 46400 1727204519.95195: checking to see if all hosts have failed and the running result is not ok 46400 1727204519.95196: done checking to see if all hosts have failed 46400 1727204519.95197: getting the remaining hosts for this loop 46400 1727204519.95199: done getting the remaining hosts for this loop 46400 1727204519.95204: getting the next task for host managed-node2 46400 1727204519.95213: done getting next task for host managed-node2 46400 1727204519.95217: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204519.95223: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204519.95241: getting variables 46400 1727204519.95243: in VariableManager get_vars() 46400 1727204519.95283: Calling all_inventory to load vars for managed-node2 46400 1727204519.95287: Calling groups_inventory to load vars for managed-node2 46400 1727204519.95289: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204519.95302: Calling all_plugins_play to load vars for managed-node2 46400 1727204519.95305: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204519.95307: Calling groups_plugins_play to load vars for managed-node2 46400 1727204520.02834: done sending task result for task 0affcd87-79f5-1303-fda8-000000000216 46400 1727204520.02839: WORKER PROCESS EXITING 46400 1727204520.03500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204520.05192: done with get_vars() 46400 1727204520.05218: done getting variables 46400 1727204520.05307: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:00 -0400 (0:00:00.130) 0:00:10.337 ***** 46400 1727204520.05337: entering _queue_task() for managed-node2/service 46400 1727204520.05338: Creating lock for service 46400 1727204520.05670: worker is 1 (out of 1 available) 46400 1727204520.05685: exiting _queue_task() for managed-node2/service 46400 1727204520.05698: done queuing things up, now waiting for results queue to drain 46400 1727204520.05700: waiting for pending results... 
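The "Restart NetworkManager due to wireless or team interfaces" task queued above is only needed when the requested connections include wireless or team types; the flags it checks are derived from network_connections. An illustrative approximation of that check (not the role's actual Jinja defaults; the connection entry is hypothetical and simply mirrors the single 'interface' seen in the play vars):

# Rough equivalent of the disjunction evaluated below:
# __network_wireless_connections_defined or __network_team_connections_defined
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "state": "up"},  # hypothetical
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

print(wireless_defined or team_defined)   # False -> task skipped, as in the trace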
46400 1727204520.05977: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204520.06134: in run() - task 0affcd87-79f5-1303-fda8-000000000217 46400 1727204520.06158: variable 'ansible_search_path' from source: unknown 46400 1727204520.06172: variable 'ansible_search_path' from source: unknown 46400 1727204520.06210: calling self._execute() 46400 1727204520.06305: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204520.06319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204520.06332: variable 'omit' from source: magic vars 46400 1727204520.06727: variable 'ansible_distribution_major_version' from source: facts 46400 1727204520.06746: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204520.06887: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204520.07103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204520.09503: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204520.09592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204520.09637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204520.09681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204520.09712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204520.09802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.09836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.09877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.09924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.09943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.09999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.10025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.10054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204520.10105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.10125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.10178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.10207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.10237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.10290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.10310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.10493: variable 'network_connections' from source: include params 46400 1727204520.10511: variable 'interface' from source: play vars 46400 1727204520.10591: variable 'interface' from source: play vars 46400 1727204520.10671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204520.10969: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204520.11010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204520.11157: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204520.11195: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204520.11245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204520.11280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204520.11398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.11430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204520.11529: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204520.12067: variable 'network_connections' from source: include params 46400 1727204520.12130: variable 'interface' 
from source: play vars 46400 1727204520.12200: variable 'interface' from source: play vars 46400 1727204520.12344: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204520.12371: when evaluation is False, skipping this task 46400 1727204520.12398: _execute() done 46400 1727204520.12406: dumping result to json 46400 1727204520.12413: done dumping result, returning 46400 1727204520.12444: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000217] 46400 1727204520.12467: sending task result for task 0affcd87-79f5-1303-fda8-000000000217 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204520.12636: no more pending results, returning what we have 46400 1727204520.12640: results queue empty 46400 1727204520.12641: checking for any_errors_fatal 46400 1727204520.12648: done checking for any_errors_fatal 46400 1727204520.12649: checking for max_fail_percentage 46400 1727204520.12650: done checking for max_fail_percentage 46400 1727204520.12651: checking to see if all hosts have failed and the running result is not ok 46400 1727204520.12652: done checking to see if all hosts have failed 46400 1727204520.12653: getting the remaining hosts for this loop 46400 1727204520.12655: done getting the remaining hosts for this loop 46400 1727204520.12659: getting the next task for host managed-node2 46400 1727204520.12673: done getting next task for host managed-node2 46400 1727204520.12677: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204520.12681: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204520.12697: getting variables 46400 1727204520.12698: in VariableManager get_vars() 46400 1727204520.12736: Calling all_inventory to load vars for managed-node2 46400 1727204520.12739: Calling groups_inventory to load vars for managed-node2 46400 1727204520.12741: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204520.12752: Calling all_plugins_play to load vars for managed-node2 46400 1727204520.12754: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204520.12757: Calling groups_plugins_play to load vars for managed-node2 46400 1727204520.13785: done sending task result for task 0affcd87-79f5-1303-fda8-000000000217 46400 1727204520.13789: WORKER PROCESS EXITING 46400 1727204520.14797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204520.16945: done with get_vars() 46400 1727204520.16976: done getting variables 46400 1727204520.17044: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:00 -0400 (0:00:00.117) 0:00:10.455 ***** 46400 1727204520.17087: entering _queue_task() for managed-node2/service 46400 1727204520.17438: worker is 1 (out of 1 available) 46400 1727204520.17452: exiting _queue_task() for managed-node2/service 46400 1727204520.17473: done queuing things up, now waiting for results queue to drain 46400 1727204520.17475: waiting for pending results... 
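The "Enable and start NetworkManager" task that follows is the first one in this stretch that actually executes: the trace below shows Ansible discovering the remote home directory, creating a private temp dir, pushing the AnsiballZ_systemd.py payload over SFTP, and running it with the remote Python. A simplified sketch of that remote execution sequence using plain ssh/scp via subprocess (not Ansible's ssh connection plugin with ControlPersist multiplexing; host, temp-dir name, and payload filename here are placeholders):

import subprocess

HOST = "root@10.31.13.78"                               # managed node seen in the ssh debug output
REMOTE_TMP = "/root/.ansible/tmp/ansible-tmp-example"   # hypothetical temp dir name

def ssh(cmd: str) -> str:
    """Run a single shell command on the managed node and return its stdout."""
    return subprocess.run(["ssh", HOST, cmd],
                          capture_output=True, text=True, check=True).stdout

print(ssh("echo ~ && sleep 0"))                          # discover the remote home dir
ssh(f"umask 77 && mkdir -p {REMOTE_TMP}")                # create a private temp dir
subprocess.run(["scp", "AnsiballZ_systemd.py",
                f"{HOST}:{REMOTE_TMP}/"], check=True)    # transfer the module payload
ssh(f"chmod u+x {REMOTE_TMP} {REMOTE_TMP}/AnsiballZ_systemd.py")
print(ssh(f"/usr/bin/python3.9 {REMOTE_TMP}/AnsiballZ_systemd.py"))  # JSON result on stdout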
46400 1727204520.17775: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204520.17957: in run() - task 0affcd87-79f5-1303-fda8-000000000218 46400 1727204520.17991: variable 'ansible_search_path' from source: unknown 46400 1727204520.18026: variable 'ansible_search_path' from source: unknown 46400 1727204520.18138: calling self._execute() 46400 1727204520.18471: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204520.18476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204520.18479: variable 'omit' from source: magic vars 46400 1727204520.18623: variable 'ansible_distribution_major_version' from source: facts 46400 1727204520.18638: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204520.18830: variable 'network_provider' from source: set_fact 46400 1727204520.18834: variable 'network_state' from source: role '' defaults 46400 1727204520.18847: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204520.18853: variable 'omit' from source: magic vars 46400 1727204520.18918: variable 'omit' from source: magic vars 46400 1727204520.18945: variable 'network_service_name' from source: role '' defaults 46400 1727204520.19016: variable 'network_service_name' from source: role '' defaults 46400 1727204520.19120: variable '__network_provider_setup' from source: role '' defaults 46400 1727204520.19125: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204520.19191: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204520.19198: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204520.19260: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204520.19485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204520.21802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204520.21855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204520.21887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204520.21914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204520.21934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204520.22046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.22055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.22095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.22121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204520.22151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.22206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.22234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.22978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.22982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.22985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.22987: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204520.22990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.22992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.22994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.22996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.22999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.23001: variable 'ansible_python' from source: facts 46400 1727204520.23003: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204520.23005: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204520.23122: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204520.23169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.23197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.23214: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.23260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.23275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.23331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204520.23360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204520.23391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.23923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204520.23927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204520.23929: variable 'network_connections' from source: include params 46400 1727204520.23932: variable 'interface' from source: play vars 46400 1727204520.23934: variable 'interface' from source: play vars 46400 1727204520.24057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204520.24385: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204520.24551: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204520.24613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204520.24762: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204520.24840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204520.24987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204520.25019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204520.25065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204520.25152: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204520.25471: variable 'network_connections' from source: include params 46400 1727204520.25482: variable 'interface' from source: play vars 46400 1727204520.25568: variable 'interface' from source: play vars 46400 1727204520.25625: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204520.25711: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204520.26035: variable 'network_connections' from source: include params 46400 1727204520.26050: variable 'interface' from source: play vars 46400 1727204520.26126: variable 'interface' from source: play vars 46400 1727204520.26161: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204520.26234: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204520.26438: variable 'network_connections' from source: include params 46400 1727204520.26442: variable 'interface' from source: play vars 46400 1727204520.26496: variable 'interface' from source: play vars 46400 1727204520.26540: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204520.26589: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204520.26592: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204520.26635: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204520.26772: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204520.27383: variable 'network_connections' from source: include params 46400 1727204520.27393: variable 'interface' from source: play vars 46400 1727204520.27466: variable 'interface' from source: play vars 46400 1727204520.27483: variable 'ansible_distribution' from source: facts 46400 1727204520.27490: variable '__network_rh_distros' from source: role '' defaults 46400 1727204520.27499: variable 'ansible_distribution_major_version' from source: facts 46400 1727204520.27539: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204520.27748: variable 'ansible_distribution' from source: facts 46400 1727204520.27757: variable '__network_rh_distros' from source: role '' defaults 46400 1727204520.27772: variable 'ansible_distribution_major_version' from source: facts 46400 1727204520.27787: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204520.28074: variable 'ansible_distribution' from source: facts 46400 1727204520.28078: variable '__network_rh_distros' from source: role '' defaults 46400 1727204520.28080: variable 'ansible_distribution_major_version' from source: facts 46400 1727204520.28420: variable 'network_provider' from source: set_fact 46400 1727204520.28423: variable 'omit' from source: magic vars 46400 1727204520.28425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204520.28428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204520.28430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204520.28432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204520.28434: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204520.28436: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204520.28437: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204520.28440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204520.28443: Set connection var ansible_shell_type to sh 46400 1727204520.28445: Set connection var ansible_shell_executable to /bin/sh 46400 1727204520.28447: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204520.28449: Set connection var ansible_connection to ssh 46400 1727204520.28451: Set connection var ansible_pipelining to False 46400 1727204520.28453: Set connection var ansible_timeout to 10 46400 1727204520.28456: variable 'ansible_shell_executable' from source: unknown 46400 1727204520.28458: variable 'ansible_connection' from source: unknown 46400 1727204520.28462: variable 'ansible_module_compression' from source: unknown 46400 1727204520.28467: variable 'ansible_shell_type' from source: unknown 46400 1727204520.28469: variable 'ansible_shell_executable' from source: unknown 46400 1727204520.28471: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204520.28473: variable 'ansible_pipelining' from source: unknown 46400 1727204520.28474: variable 'ansible_timeout' from source: unknown 46400 1727204520.28476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204520.29018: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204520.29026: variable 'omit' from source: magic vars 46400 1727204520.29028: starting attempt loop 46400 1727204520.29030: running the handler 46400 1727204520.29032: variable 'ansible_facts' from source: unknown 46400 1727204520.29582: _low_level_execute_command(): starting 46400 1727204520.29585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204520.30107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204520.30119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.30130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.30143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.30183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.30191: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204520.30219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.30222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204520.30225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204520.30227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204520.30232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.30251: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204520.30254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.30256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.30268: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204520.30285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.30344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204520.30369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204520.30375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204520.30457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204520.32096: stdout chunk (state=3): >>>/root <<< 46400 1727204520.32201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204520.32293: stderr chunk (state=3): >>><<< 46400 1727204520.32297: stdout chunk (state=3): >>><<< 46400 1727204520.32304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204520.32321: _low_level_execute_command(): starting 46400 1727204520.32325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970 `" && echo ansible-tmp-1727204520.323052-47148-272799539018970="` echo /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970 `" ) && sleep 0' 46400 1727204520.32976: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204520.32985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.32995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.33008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.33045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.33052: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204520.33069: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.33082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204520.33085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204520.33093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204520.33100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.33110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.33121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.33129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.33135: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204520.33144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.33227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204520.33250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204520.33270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204520.33350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204520.35195: stdout chunk (state=3): >>>ansible-tmp-1727204520.323052-47148-272799539018970=/root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970 <<< 46400 1727204520.35321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204520.35402: stderr chunk (state=3): >>><<< 46400 1727204520.35417: stdout chunk (state=3): >>><<< 46400 1727204520.35478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204520.323052-47148-272799539018970=/root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204520.35487: variable 'ansible_module_compression' from source: unknown 46400 1727204520.35583: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 46400 1727204520.35587: ANSIBALLZ: Acquiring lock 46400 1727204520.35589: ANSIBALLZ: Lock acquired: 140519374124768 46400 1727204520.35592: ANSIBALLZ: Creating module 46400 
1727204520.62294: ANSIBALLZ: Writing module into payload 46400 1727204520.62510: ANSIBALLZ: Writing module 46400 1727204520.62545: ANSIBALLZ: Renaming module 46400 1727204520.62557: ANSIBALLZ: Done creating module 46400 1727204520.62600: variable 'ansible_facts' from source: unknown 46400 1727204520.62796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/AnsiballZ_systemd.py 46400 1727204520.62959: Sending initial data 46400 1727204520.62962: Sent initial data (155 bytes) 46400 1727204520.64008: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204520.64016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.64027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.64041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.64087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.64096: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204520.64106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.64121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204520.64128: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204520.64141: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204520.64143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.64151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.64167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.64175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.64182: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204520.64192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.64272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204520.64288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204520.64291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204520.64424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204520.66219: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204520.66259: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204520.66297: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp0xmywtig /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/AnsiballZ_systemd.py <<< 46400 1727204520.66335: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204520.68899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204520.68984: stderr chunk (state=3): >>><<< 46400 1727204520.68988: stdout chunk (state=3): >>><<< 46400 1727204520.69011: done transferring module to remote 46400 1727204520.69031: _low_level_execute_command(): starting 46400 1727204520.69034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/ /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/AnsiballZ_systemd.py && sleep 0' 46400 1727204520.69941: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204520.69951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.69966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.69977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.70022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.70029: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204520.70039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.70053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204520.70065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204520.70071: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204520.70080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.70089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.70101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.70108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.70117: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204520.70134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.70209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204520.70230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204520.70248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204520.70310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204520.72116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204520.72120: stdout chunk (state=3): >>><<< 46400 1727204520.72126: stderr chunk (state=3): >>><<< 46400 1727204520.72146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204520.72154: _low_level_execute_command(): starting 46400 1727204520.72157: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/AnsiballZ_systemd.py && sleep 0' 46400 1727204520.72819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204520.72828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.72838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.72852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.72893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.72904: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204520.72919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.72955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204520.72958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204520.72965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204520.72968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204520.72979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204520.72987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204520.72995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204520.73002: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204520.73015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204520.73141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204520.73144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204520.73146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204520.73269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204520.98434: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", 
"Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204520.98481: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6967296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1985870000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204520.98498: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": 
"enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204520.99979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204521.00040: stderr chunk (state=3): >>><<< 46400 1727204521.00043: stdout chunk (state=3): >>><<< 46400 1727204521.00062: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6967296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1985870000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204521.00176: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204521.00192: _low_level_execute_command(): starting 46400 1727204521.00197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204520.323052-47148-272799539018970/ > /dev/null 2>&1 && sleep 0' 46400 1727204521.00686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.00692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.00725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.00738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.00793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.00805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.00850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.02646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.02706: stderr chunk (state=3): >>><<< 46400 1727204521.02709: stdout chunk (state=3): >>><<< 46400 1727204521.02722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204521.02728: handler run complete 46400 1727204521.02767: attempt loop complete, returning result 46400 1727204521.02770: _execute() done 46400 1727204521.02773: dumping result to json 46400 1727204521.02789: done dumping result, returning 46400 1727204521.02799: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000000218] 46400 1727204521.02802: sending task result for task 0affcd87-79f5-1303-fda8-000000000218 46400 1727204521.03035: done sending task result for task 0affcd87-79f5-1303-fda8-000000000218 46400 1727204521.03037: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204521.03096: no more pending results, returning what we have 46400 1727204521.03100: results queue empty 46400 1727204521.03101: checking for any_errors_fatal 46400 1727204521.03106: done checking for any_errors_fatal 46400 1727204521.03106: checking for max_fail_percentage 46400 1727204521.03108: done checking for max_fail_percentage 46400 1727204521.03109: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.03110: done checking to see if all hosts have failed 46400 1727204521.03110: getting the remaining hosts for this loop 46400 1727204521.03112: done getting the remaining hosts for this loop 46400 1727204521.03116: getting the next task for host managed-node2 46400 1727204521.03123: done getting next task for host managed-node2 46400 1727204521.03127: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204521.03133: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204521.03142: getting variables 46400 1727204521.03144: in VariableManager get_vars() 46400 1727204521.03177: Calling all_inventory to load vars for managed-node2 46400 1727204521.03180: Calling groups_inventory to load vars for managed-node2 46400 1727204521.03182: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.03191: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.03193: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.03195: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.03996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.04933: done with get_vars() 46400 1727204521.04950: done getting variables 46400 1727204521.05001: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.879) 0:00:11.334 ***** 46400 1727204521.05031: entering _queue_task() for managed-node2/service 46400 1727204521.05268: worker is 1 (out of 1 available) 46400 1727204521.05283: exiting _queue_task() for managed-node2/service 46400 1727204521.05295: done queuing things up, now waiting for results queue to drain 46400 1727204521.05296: waiting for pending results... 46400 1727204521.05466: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204521.05553: in run() - task 0affcd87-79f5-1303-fda8-000000000219 46400 1727204521.05565: variable 'ansible_search_path' from source: unknown 46400 1727204521.05574: variable 'ansible_search_path' from source: unknown 46400 1727204521.05604: calling self._execute() 46400 1727204521.05684: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.05690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.05698: variable 'omit' from source: magic vars 46400 1727204521.05980: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.05990: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.06076: variable 'network_provider' from source: set_fact 46400 1727204521.06080: Evaluated conditional (network_provider == "nm"): True 46400 1727204521.06140: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204521.06207: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204521.06326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204521.08122: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204521.08169: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204521.08198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204521.08222: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204521.08243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204521.08305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204521.08325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204521.08344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204521.08378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204521.08388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204521.08420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204521.08436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204521.08455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204521.08488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204521.08499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204521.08527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204521.08543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204521.08559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204521.08591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204521.08602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 46400 1727204521.08698: variable 'network_connections' from source: include params 46400 1727204521.08708: variable 'interface' from source: play vars 46400 1727204521.08758: variable 'interface' from source: play vars 46400 1727204521.08815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204521.08936: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204521.08963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204521.08988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204521.09011: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204521.09043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204521.09058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204521.09079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204521.09097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204521.09137: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204521.09292: variable 'network_connections' from source: include params 46400 1727204521.09296: variable 'interface' from source: play vars 46400 1727204521.09343: variable 'interface' from source: play vars 46400 1727204521.09375: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204521.09378: when evaluation is False, skipping this task 46400 1727204521.09381: _execute() done 46400 1727204521.09383: dumping result to json 46400 1727204521.09386: done dumping result, returning 46400 1727204521.09392: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000000219] 46400 1727204521.09406: sending task result for task 0affcd87-79f5-1303-fda8-000000000219 46400 1727204521.09494: done sending task result for task 0affcd87-79f5-1303-fda8-000000000219 46400 1727204521.09496: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204521.09537: no more pending results, returning what we have 46400 1727204521.09541: results queue empty 46400 1727204521.09542: checking for any_errors_fatal 46400 1727204521.09572: done checking for any_errors_fatal 46400 1727204521.09574: checking for max_fail_percentage 46400 1727204521.09575: done checking for max_fail_percentage 46400 1727204521.09576: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.09577: done checking to see if all hosts have failed 46400 1727204521.09578: getting the remaining hosts for this loop 46400 1727204521.09579: done 
getting the remaining hosts for this loop 46400 1727204521.09583: getting the next task for host managed-node2 46400 1727204521.09591: done getting next task for host managed-node2 46400 1727204521.09595: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204521.09600: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204521.09614: getting variables 46400 1727204521.09615: in VariableManager get_vars() 46400 1727204521.09650: Calling all_inventory to load vars for managed-node2 46400 1727204521.09653: Calling groups_inventory to load vars for managed-node2 46400 1727204521.09655: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.09668: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.09671: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.09679: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.10597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.11513: done with get_vars() 46400 1727204521.11531: done getting variables 46400 1727204521.11578: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.065) 0:00:11.400 ***** 46400 1727204521.11603: entering _queue_task() for managed-node2/service 46400 1727204521.11830: worker is 1 (out of 1 available) 46400 1727204521.11844: exiting _queue_task() for managed-node2/service 46400 1727204521.11857: done queuing things up, now waiting for results queue to drain 46400 1727204521.11859: waiting for pending results... 
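
The wpa_supplicant task above never reaches the target host: network_provider == "nm" evaluates True, but __network_wpa_supplicant_required evaluates False, so the service action is skipped on the controller. A hedged sketch of that gating pattern follows, assuming the conventional service-module form; the exact task text in the role is not shown in this log.

# Sketch of the conditional pattern implied by the skip above.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"            # evaluated True in the log
    - __network_wpa_supplicant_required   # evaluated False, so the task is skipped
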
46400 1727204521.12043: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204521.12142: in run() - task 0affcd87-79f5-1303-fda8-00000000021a 46400 1727204521.12153: variable 'ansible_search_path' from source: unknown 46400 1727204521.12156: variable 'ansible_search_path' from source: unknown 46400 1727204521.12190: calling self._execute() 46400 1727204521.12255: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.12259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.12274: variable 'omit' from source: magic vars 46400 1727204521.12548: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.12558: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.12642: variable 'network_provider' from source: set_fact 46400 1727204521.12646: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204521.12649: when evaluation is False, skipping this task 46400 1727204521.12651: _execute() done 46400 1727204521.12654: dumping result to json 46400 1727204521.12656: done dumping result, returning 46400 1727204521.12668: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-00000000021a] 46400 1727204521.12673: sending task result for task 0affcd87-79f5-1303-fda8-00000000021a 46400 1727204521.12759: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021a 46400 1727204521.12762: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204521.12808: no more pending results, returning what we have 46400 1727204521.12812: results queue empty 46400 1727204521.12813: checking for any_errors_fatal 46400 1727204521.12822: done checking for any_errors_fatal 46400 1727204521.12823: checking for max_fail_percentage 46400 1727204521.12825: done checking for max_fail_percentage 46400 1727204521.12825: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.12826: done checking to see if all hosts have failed 46400 1727204521.12827: getting the remaining hosts for this loop 46400 1727204521.12829: done getting the remaining hosts for this loop 46400 1727204521.12832: getting the next task for host managed-node2 46400 1727204521.12842: done getting next task for host managed-node2 46400 1727204521.12846: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204521.12851: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204521.12868: getting variables 46400 1727204521.12870: in VariableManager get_vars() 46400 1727204521.12907: Calling all_inventory to load vars for managed-node2 46400 1727204521.12910: Calling groups_inventory to load vars for managed-node2 46400 1727204521.12912: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.12921: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.12923: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.12925: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.13701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.14730: done with get_vars() 46400 1727204521.14744: done getting variables 46400 1727204521.14788: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.032) 0:00:11.432 ***** 46400 1727204521.14812: entering _queue_task() for managed-node2/copy 46400 1727204521.15030: worker is 1 (out of 1 available) 46400 1727204521.15044: exiting _queue_task() for managed-node2/copy 46400 1727204521.15056: done queuing things up, now waiting for results queue to drain 46400 1727204521.15058: waiting for pending results... 
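
"Enable network service" follows the same pattern with the opposite provider check: network_provider == "initscripts" evaluates False under the NetworkManager provider, so the task is skipped before any module runs, and the "Ensure initscripts network file dependency is present" copy task that is queued next is gated the same way. A hedged sketch, assuming the legacy unit is the conventional "network" service (the real task body is not visible here):

# Sketch of the provider gate behind the skip above; the service name is an assumption.
- name: Enable network service
  ansible.builtin.service:
    name: network                           # legacy initscripts networking service
    state: started
    enabled: true
  when: network_provider == "initscripts"   # False with the nm provider, so skipped
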
46400 1727204521.15235: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204521.15329: in run() - task 0affcd87-79f5-1303-fda8-00000000021b 46400 1727204521.15341: variable 'ansible_search_path' from source: unknown 46400 1727204521.15344: variable 'ansible_search_path' from source: unknown 46400 1727204521.15378: calling self._execute() 46400 1727204521.15443: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.15447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.15456: variable 'omit' from source: magic vars 46400 1727204521.15727: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.15737: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.15819: variable 'network_provider' from source: set_fact 46400 1727204521.15823: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204521.15826: when evaluation is False, skipping this task 46400 1727204521.15828: _execute() done 46400 1727204521.15831: dumping result to json 46400 1727204521.15833: done dumping result, returning 46400 1727204521.15841: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-00000000021b] 46400 1727204521.15846: sending task result for task 0affcd87-79f5-1303-fda8-00000000021b 46400 1727204521.15936: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021b 46400 1727204521.15938: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204521.15991: no more pending results, returning what we have 46400 1727204521.15996: results queue empty 46400 1727204521.15997: checking for any_errors_fatal 46400 1727204521.16006: done checking for any_errors_fatal 46400 1727204521.16007: checking for max_fail_percentage 46400 1727204521.16008: done checking for max_fail_percentage 46400 1727204521.16009: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.16010: done checking to see if all hosts have failed 46400 1727204521.16011: getting the remaining hosts for this loop 46400 1727204521.16012: done getting the remaining hosts for this loop 46400 1727204521.16016: getting the next task for host managed-node2 46400 1727204521.16024: done getting next task for host managed-node2 46400 1727204521.16028: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204521.16033: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204521.16047: getting variables 46400 1727204521.16055: in VariableManager get_vars() 46400 1727204521.16087: Calling all_inventory to load vars for managed-node2 46400 1727204521.16089: Calling groups_inventory to load vars for managed-node2 46400 1727204521.16091: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.16100: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.16102: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.16104: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.16861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.17763: done with get_vars() 46400 1727204521.17779: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.030) 0:00:11.462 ***** 46400 1727204521.17839: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204521.17840: Creating lock for fedora.linux_system_roles.network_connections 46400 1727204521.18051: worker is 1 (out of 1 available) 46400 1727204521.18066: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204521.18080: done queuing things up, now waiting for results queue to drain 46400 1727204521.18081: waiting for pending results... 
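
The "Configure networking connection profiles" task queued here is the first one in this stretch that dispatches the collection's own network_connections module, and the variable trace that follows shows network_connections coming from include params and being built around the interface play var. Purely as an illustration of that shape (the test playbook's real values are not shown in this log, and the interface name below is hypothetical), a play driving the role might look like:

# Illustrative only: variable values are assumptions, not taken from this run.
- hosts: managed-node2
  vars:
    interface: testnic1                # hypothetical; only the variable name appears in the log
    network_connections:
      - name: "{{ interface }}"
        interface_name: "{{ interface }}"
        type: ethernet
        state: up
  roles:
    - fedora.linux_system_roles.network
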
46400 1727204521.18259: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204521.18341: in run() - task 0affcd87-79f5-1303-fda8-00000000021c 46400 1727204521.18354: variable 'ansible_search_path' from source: unknown 46400 1727204521.18358: variable 'ansible_search_path' from source: unknown 46400 1727204521.18389: calling self._execute() 46400 1727204521.18461: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.18470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.18479: variable 'omit' from source: magic vars 46400 1727204521.18747: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.18757: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.18769: variable 'omit' from source: magic vars 46400 1727204521.18814: variable 'omit' from source: magic vars 46400 1727204521.18930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204521.20484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204521.20530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204521.20557: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204521.20587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204521.20609: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204521.20663: variable 'network_provider' from source: set_fact 46400 1727204521.20757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204521.20781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204521.20800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204521.20828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204521.20842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204521.20895: variable 'omit' from source: magic vars 46400 1727204521.20972: variable 'omit' from source: magic vars 46400 1727204521.21053: variable 'network_connections' from source: include params 46400 1727204521.21074: variable 'interface' from source: play vars 46400 1727204521.21120: variable 'interface' from source: play vars 46400 1727204521.21230: variable 'omit' from source: magic vars 46400 1727204521.21236: variable '__lsr_ansible_managed' from source: task vars 46400 1727204521.21287: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204521.21414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204521.21802: Loaded config def from plugin (lookup/template) 46400 1727204521.21805: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204521.21829: File lookup term: get_ansible_managed.j2 46400 1727204521.21832: variable 'ansible_search_path' from source: unknown 46400 1727204521.21836: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204521.21846: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204521.21859: variable 'ansible_search_path' from source: unknown 46400 1727204521.25109: variable 'ansible_managed' from source: unknown 46400 1727204521.25196: variable 'omit' from source: magic vars 46400 1727204521.25216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204521.25237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204521.25253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204521.25269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204521.25279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204521.25300: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204521.25303: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.25306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.25369: Set connection var ansible_shell_type to sh 46400 1727204521.25380: Set connection var ansible_shell_executable to /bin/sh 46400 1727204521.25383: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204521.25390: Set connection var ansible_connection to ssh 46400 1727204521.25393: Set connection var ansible_pipelining to False 46400 1727204521.25398: Set connection var ansible_timeout to 10 46400 1727204521.25418: variable 'ansible_shell_executable' from source: unknown 46400 1727204521.25421: variable 'ansible_connection' from source: unknown 46400 1727204521.25423: variable 'ansible_module_compression' 
from source: unknown 46400 1727204521.25425: variable 'ansible_shell_type' from source: unknown 46400 1727204521.25427: variable 'ansible_shell_executable' from source: unknown 46400 1727204521.25430: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.25434: variable 'ansible_pipelining' from source: unknown 46400 1727204521.25436: variable 'ansible_timeout' from source: unknown 46400 1727204521.25442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.25537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204521.25549: variable 'omit' from source: magic vars 46400 1727204521.25554: starting attempt loop 46400 1727204521.25558: running the handler 46400 1727204521.25570: _low_level_execute_command(): starting 46400 1727204521.25577: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204521.26100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.26116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.26143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204521.26155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204521.26173: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.26215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.26230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.26241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.26294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.27942: stdout chunk (state=3): >>>/root <<< 46400 1727204521.28043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.28104: stderr chunk (state=3): >>><<< 46400 1727204521.28107: stdout chunk (state=3): >>><<< 46400 1727204521.28126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204521.28136: _low_level_execute_command(): starting 46400 1727204521.28141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479 `" && echo ansible-tmp-1727204521.281264-47192-233788331344479="` echo /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479 `" ) && sleep 0' 46400 1727204521.28603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.28614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.28646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.28651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204521.28657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204521.28669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.28680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.28735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.28738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.28750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.28801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.30677: stdout chunk (state=3): >>>ansible-tmp-1727204521.281264-47192-233788331344479=/root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479 <<< 46400 1727204521.30796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.30853: stderr chunk (state=3): >>><<< 46400 1727204521.30857: stdout chunk (state=3): >>><<< 46400 1727204521.30881: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204521.281264-47192-233788331344479=/root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204521.30920: variable 'ansible_module_compression' from source: unknown 46400 1727204521.30960: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 46400 1727204521.30963: ANSIBALLZ: Acquiring lock 46400 1727204521.30972: ANSIBALLZ: Lock acquired: 140519374238784 46400 1727204521.30977: ANSIBALLZ: Creating module 46400 1727204521.48909: ANSIBALLZ: Writing module into payload 46400 1727204521.49382: ANSIBALLZ: Writing module 46400 1727204521.49413: ANSIBALLZ: Renaming module 46400 1727204521.49418: ANSIBALLZ: Done creating module 46400 1727204521.49442: variable 'ansible_facts' from source: unknown 46400 1727204521.49554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/AnsiballZ_network_connections.py 46400 1727204521.49715: Sending initial data 46400 1727204521.49724: Sent initial data (167 bytes) 46400 1727204521.50726: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204521.50735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.50745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.50759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.50803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.50811: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204521.50821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.50833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204521.50841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204521.50848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204521.50857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.50879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.50891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.50899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.50908: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204521.50918: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.50993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.51009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.51013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.51093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.52944: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204521.53009: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204521.53015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpeb1i5q9z /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/AnsiballZ_network_connections.py <<< 46400 1727204521.53046: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204521.54586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.54685: stderr chunk (state=3): >>><<< 46400 1727204521.54689: stdout chunk (state=3): >>><<< 46400 1727204521.54711: done transferring module to remote 46400 1727204521.54722: _low_level_execute_command(): starting 46400 1727204521.54727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/ /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/AnsiballZ_network_connections.py && sleep 0' 46400 1727204521.55398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204521.55406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.55417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.55430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.55475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.55481: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204521.55491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.55504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204521.55512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204521.55518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204521.55526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.55536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204521.55546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.55553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.55562: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204521.55577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.55651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.55662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.55679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.55745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.57627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.57656: stderr chunk (state=3): >>><<< 46400 1727204521.57660: stdout chunk (state=3): >>><<< 46400 1727204521.57758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204521.57762: _low_level_execute_command(): starting 46400 1727204521.57767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/AnsiballZ_network_connections.py && sleep 0' 46400 1727204521.58348: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204521.58367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.58387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.58405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.58450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.58462: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204521.58481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.58498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204521.58509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.13.78 is address <<< 46400 1727204521.58519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204521.58529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.58541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.58558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.58573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204521.58584: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204521.58597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.58677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.58695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.58710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.58800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.84655: stdout chunk (state=3): >>> <<< 46400 1727204521.84663: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204521.87585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204521.87641: stderr chunk (state=3): >>><<< 46400 1727204521.87645: stdout chunk (state=3): >>><<< 46400 1727204521.87661: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
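
The entries above complete one full remote-execution round trip for the network_connections module: create a private temp directory on the managed node, sftp the AnsiballZ payload across, chmod it, run it with the remote Python, and read back the single JSON document it prints on stdout (the cleanup of the temp directory follows in the next entries). The sketch below is a minimal illustration of that sequence, not Ansible's own code; the host alias, payload filename, and helper names are placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch of the remote round trip traced above (not Ansible's code).

The host alias, payload name and interpreter are placeholders; a real
AnsiballZ payload carries its own embedded arguments and bookkeeping.
"""
import json
import shlex
import subprocess

HOST = "root@managed-node.example"      # placeholder managed node
PAYLOAD = "AnsiballZ_example.py"        # placeholder module payload built locally


def ssh(cmd: str) -> subprocess.CompletedProcess:
    """Run one remote command the way _low_level_execute_command() does."""
    return subprocess.run(["ssh", HOST, "/bin/sh -c " + shlex.quote(cmd)],
                          capture_output=True, text=True, check=True)


def run_module() -> dict:
    # 1. private temp dir (the log uses 'umask 77 && mkdir ...ansible-tmp-<ts>-<pid>-<rand>')
    tmpdir = ssh("umask 77 && mkdir -p ~/.ansible/tmp && "
                 "mktemp -d ~/.ansible/tmp/ansible-tmp-XXXXXX").stdout.strip()
    try:
        # 2. transfer the payload (the log does this over sftp with 'put')
        subprocess.run(["scp", "-q", PAYLOAD, f"{HOST}:{tmpdir}/"], check=True)
        # 3. mark it executable and run it with the remote interpreter
        ssh(f"chmod u+x {tmpdir} {tmpdir}/{PAYLOAD}")
        out = ssh(f"python3 {tmpdir}/{PAYLOAD}").stdout
        # 4. the module prints exactly one JSON document on stdout
        return json.loads(out)
    finally:
        # 5. clean up, mirroring the final 'rm -f -r .../ > /dev/null 2>&1 && sleep 0'
        ssh(f"rm -rf {tmpdir}")


if __name__ == "__main__":
    result = run_module()
    print(result.get("changed"), result.get("stderr", "").strip())
```

Each of these remote commands finishes in a few tens of milliseconds because the SSH ControlMaster connection is reused throughout, which is what the repeated "auto-mux: Trying existing master" and "mux_client_request_session" lines in the log indicate.
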
46400 1727204521.87696: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204521.87704: _low_level_execute_command(): starting 46400 1727204521.87708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204521.281264-47192-233788331344479/ > /dev/null 2>&1 && sleep 0' 46400 1727204521.88191: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204521.88195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204521.88229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.88232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204521.88235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204521.88294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204521.88297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204521.88299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204521.88349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204521.90206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204521.90268: stderr chunk (state=3): >>><<< 46400 1727204521.90272: stdout chunk (state=3): >>><<< 46400 1727204521.90286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204521.90292: handler run complete 46400 1727204521.90315: attempt loop complete, returning result 46400 1727204521.90318: _execute() done 46400 1727204521.90320: dumping result to json 46400 1727204521.90325: done dumping result, returning 46400 1727204521.90333: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-00000000021c] 46400 1727204521.90341: sending task result for task 0affcd87-79f5-1303-fda8-00000000021c 46400 1727204521.90442: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021c 46400 1727204521.90447: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87 46400 1727204521.90541: no more pending results, returning what we have 46400 1727204521.90545: results queue empty 46400 1727204521.90546: checking for any_errors_fatal 46400 1727204521.90552: done checking for any_errors_fatal 46400 1727204521.90553: checking for max_fail_percentage 46400 1727204521.90554: done checking for max_fail_percentage 46400 1727204521.90555: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.90556: done checking to see if all hosts have failed 46400 1727204521.90557: getting the remaining hosts for this loop 46400 1727204521.90558: done getting the remaining hosts for this loop 46400 1727204521.90567: getting the next task for host managed-node2 46400 1727204521.90574: done getting next task for host managed-node2 46400 1727204521.90578: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204521.90582: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204521.90593: getting variables 46400 1727204521.90594: in VariableManager get_vars() 46400 1727204521.90629: Calling all_inventory to load vars for managed-node2 46400 1727204521.90631: Calling groups_inventory to load vars for managed-node2 46400 1727204521.90633: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.90642: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.90644: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.90646: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.91645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.92551: done with get_vars() 46400 1727204521.92576: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.747) 0:00:12.210 ***** 46400 1727204521.92641: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204521.92642: Creating lock for fedora.linux_system_roles.network_state 46400 1727204521.92895: worker is 1 (out of 1 available) 46400 1727204521.92910: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204521.92924: done queuing things up, now waiting for results queue to drain 46400 1727204521.92926: waiting for pending results... 
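
Each TASK banner in this log ends with two durations, as in "(0:00:00.747) 0:00:12.210" above: the parenthesised value is (approximately) the time elapsed since the previous banner, i.e. the previous task's duration, and the second value is the running total for the play. Below is a small sketch of that arithmetic with made-up timestamps; only the formatting and the subtraction mirror the log, nothing here is taken from this run.

```python
# Illustrative sketch of the "(duration)  cumulative" pair printed in each TASK banner.
# The epoch timestamps below are made up; only the arithmetic mirrors the log.
prev_banner = 100.000            # when the previous TASK banner was printed
now = 100.747                    # when this banner is printed
play_start = now - 12.210        # chosen so the cumulative value matches the example


def hms(seconds: float) -> str:
    """Format seconds as H:MM:SS.mmm, the way the banner prints durations."""
    minutes, secs = divmod(seconds, 60)
    hours, minutes = divmod(int(minutes), 60)
    return f"{hours}:{minutes:02d}:{secs:06.3f}"


print(f"({hms(now - prev_banner)})       {hms(now - play_start)}")
# -> (0:00:00.747)       0:00:12.210
```

The later banners in this section are consistent with that reading: 0:00:12.210 plus roughly 0.031 gives the next cumulative value of 0:00:12.242, and so on.
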
46400 1727204521.93105: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204521.93201: in run() - task 0affcd87-79f5-1303-fda8-00000000021d 46400 1727204521.93214: variable 'ansible_search_path' from source: unknown 46400 1727204521.93217: variable 'ansible_search_path' from source: unknown 46400 1727204521.93248: calling self._execute() 46400 1727204521.93320: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.93324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.93334: variable 'omit' from source: magic vars 46400 1727204521.93609: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.93619: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.93706: variable 'network_state' from source: role '' defaults 46400 1727204521.93715: Evaluated conditional (network_state != {}): False 46400 1727204521.93718: when evaluation is False, skipping this task 46400 1727204521.93720: _execute() done 46400 1727204521.93723: dumping result to json 46400 1727204521.93725: done dumping result, returning 46400 1727204521.93732: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-00000000021d] 46400 1727204521.93738: sending task result for task 0affcd87-79f5-1303-fda8-00000000021d 46400 1727204521.93830: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021d 46400 1727204521.93833: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204521.93888: no more pending results, returning what we have 46400 1727204521.93893: results queue empty 46400 1727204521.93894: checking for any_errors_fatal 46400 1727204521.93908: done checking for any_errors_fatal 46400 1727204521.93908: checking for max_fail_percentage 46400 1727204521.93910: done checking for max_fail_percentage 46400 1727204521.93911: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.93912: done checking to see if all hosts have failed 46400 1727204521.93913: getting the remaining hosts for this loop 46400 1727204521.93914: done getting the remaining hosts for this loop 46400 1727204521.93918: getting the next task for host managed-node2 46400 1727204521.93926: done getting next task for host managed-node2 46400 1727204521.93930: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204521.93936: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204521.93951: getting variables 46400 1727204521.93952: in VariableManager get_vars() 46400 1727204521.93988: Calling all_inventory to load vars for managed-node2 46400 1727204521.93995: Calling groups_inventory to load vars for managed-node2 46400 1727204521.93997: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.94006: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.94008: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.94010: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.94813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.95729: done with get_vars() 46400 1727204521.95747: done getting variables 46400 1727204521.95797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.031) 0:00:12.242 ***** 46400 1727204521.95824: entering _queue_task() for managed-node2/debug 46400 1727204521.96067: worker is 1 (out of 1 available) 46400 1727204521.96081: exiting _queue_task() for managed-node2/debug 46400 1727204521.96094: done queuing things up, now waiting for results queue to drain 46400 1727204521.96095: waiting for pending results... 
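
The two "Evaluated conditional" entries above show why "Configure networking state" was skipped: the distribution check renders True, but "network_state != {}" renders False against the role's default empty dict, so the task is skipped and that clause is reported back as false_condition. The following is a rough sketch of the idea using plain Jinja2; it is not Ansible's actual conditional code, and the fact value is made up.

```python
# Rough sketch of per-task 'when' evaluation (plain Jinja2; not Ansible's own code).
import json
from jinja2 import Environment

task_vars = {
    "ansible_distribution_major_version": "9",   # made-up fact value
    "network_state": {},                         # role default: no state-based config
}
conditions = ["ansible_distribution_major_version != '6'", "network_state != {}"]

env = Environment()


def evaluate(cond: str, variables: dict) -> bool:
    """Render the bare expression as a template and read back the boolean."""
    return env.from_string("{{ (" + cond + ") }}").render(**variables) == "True"


result = {"changed": False}
for cond in conditions:
    outcome = evaluate(cond, task_vars)
    print(f"Evaluated conditional ({cond}): {outcome}")
    if not outcome:
        # mirrors the skipped result shown for this task in the log
        result.update({"skipped": True,
                       "false_condition": cond,
                       "skip_reason": "Conditional result was False"})
        break

print(json.dumps(result, indent=4))
```

The same pattern repeats further down for "Show debug messages for the network_state", which is skipped on the identical "network_state != {}" clause.
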
46400 1727204521.96278: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204521.96368: in run() - task 0affcd87-79f5-1303-fda8-00000000021e 46400 1727204521.96380: variable 'ansible_search_path' from source: unknown 46400 1727204521.96384: variable 'ansible_search_path' from source: unknown 46400 1727204521.96414: calling self._execute() 46400 1727204521.96486: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.96489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.96501: variable 'omit' from source: magic vars 46400 1727204521.96770: variable 'ansible_distribution_major_version' from source: facts 46400 1727204521.96780: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204521.96786: variable 'omit' from source: magic vars 46400 1727204521.96833: variable 'omit' from source: magic vars 46400 1727204521.96858: variable 'omit' from source: magic vars 46400 1727204521.96894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204521.96920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204521.96938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204521.96952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204521.96965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204521.96987: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204521.96990: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.96992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.97057: Set connection var ansible_shell_type to sh 46400 1727204521.97067: Set connection var ansible_shell_executable to /bin/sh 46400 1727204521.97072: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204521.97079: Set connection var ansible_connection to ssh 46400 1727204521.97085: Set connection var ansible_pipelining to False 46400 1727204521.97090: Set connection var ansible_timeout to 10 46400 1727204521.97109: variable 'ansible_shell_executable' from source: unknown 46400 1727204521.97112: variable 'ansible_connection' from source: unknown 46400 1727204521.97114: variable 'ansible_module_compression' from source: unknown 46400 1727204521.97117: variable 'ansible_shell_type' from source: unknown 46400 1727204521.97119: variable 'ansible_shell_executable' from source: unknown 46400 1727204521.97121: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204521.97125: variable 'ansible_pipelining' from source: unknown 46400 1727204521.97127: variable 'ansible_timeout' from source: unknown 46400 1727204521.97131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204521.97234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204521.97244: variable 'omit' from source: magic vars 46400 1727204521.97249: starting attempt loop 46400 1727204521.97252: running the handler 46400 1727204521.97347: variable '__network_connections_result' from source: set_fact 46400 1727204521.97392: handler run complete 46400 1727204521.97405: attempt loop complete, returning result 46400 1727204521.97408: _execute() done 46400 1727204521.97411: dumping result to json 46400 1727204521.97413: done dumping result, returning 46400 1727204521.97420: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-00000000021e] 46400 1727204521.97426: sending task result for task 0affcd87-79f5-1303-fda8-00000000021e 46400 1727204521.97511: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021e 46400 1727204521.97515: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87" ] } 46400 1727204521.97577: no more pending results, returning what we have 46400 1727204521.97581: results queue empty 46400 1727204521.97582: checking for any_errors_fatal 46400 1727204521.97588: done checking for any_errors_fatal 46400 1727204521.97589: checking for max_fail_percentage 46400 1727204521.97594: done checking for max_fail_percentage 46400 1727204521.97595: checking to see if all hosts have failed and the running result is not ok 46400 1727204521.97596: done checking to see if all hosts have failed 46400 1727204521.97597: getting the remaining hosts for this loop 46400 1727204521.97598: done getting the remaining hosts for this loop 46400 1727204521.97602: getting the next task for host managed-node2 46400 1727204521.97611: done getting next task for host managed-node2 46400 1727204521.97615: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204521.97620: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204521.97631: getting variables 46400 1727204521.97633: in VariableManager get_vars() 46400 1727204521.97671: Calling all_inventory to load vars for managed-node2 46400 1727204521.97673: Calling groups_inventory to load vars for managed-node2 46400 1727204521.97676: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204521.97685: Calling all_plugins_play to load vars for managed-node2 46400 1727204521.97687: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204521.97689: Calling groups_plugins_play to load vars for managed-node2 46400 1727204521.98611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204521.99511: done with get_vars() 46400 1727204521.99530: done getting variables 46400 1727204521.99581: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:01 -0400 (0:00:00.037) 0:00:12.280 ***** 46400 1727204521.99611: entering _queue_task() for managed-node2/debug 46400 1727204521.99847: worker is 1 (out of 1 available) 46400 1727204521.99866: exiting _queue_task() for managed-node2/debug 46400 1727204521.99879: done queuing things up, now waiting for results queue to drain 46400 1727204521.99881: waiting for pending results... 46400 1727204522.00056: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204522.00151: in run() - task 0affcd87-79f5-1303-fda8-00000000021f 46400 1727204522.00168: variable 'ansible_search_path' from source: unknown 46400 1727204522.00172: variable 'ansible_search_path' from source: unknown 46400 1727204522.00197: calling self._execute() 46400 1727204522.00281: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.00284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.00293: variable 'omit' from source: magic vars 46400 1727204522.00573: variable 'ansible_distribution_major_version' from source: facts 46400 1727204522.00583: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204522.00590: variable 'omit' from source: magic vars 46400 1727204522.00634: variable 'omit' from source: magic vars 46400 1727204522.00658: variable 'omit' from source: magic vars 46400 1727204522.00696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204522.00722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204522.00742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204522.00754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.00768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.00791: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204522.00794: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.00796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.00863: Set connection var ansible_shell_type to sh 46400 1727204522.00879: Set connection var ansible_shell_executable to /bin/sh 46400 1727204522.00884: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204522.00889: Set connection var ansible_connection to ssh 46400 1727204522.00894: Set connection var ansible_pipelining to False 46400 1727204522.00899: Set connection var ansible_timeout to 10 46400 1727204522.00918: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.00920: variable 'ansible_connection' from source: unknown 46400 1727204522.00923: variable 'ansible_module_compression' from source: unknown 46400 1727204522.00925: variable 'ansible_shell_type' from source: unknown 46400 1727204522.00928: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.00930: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.00932: variable 'ansible_pipelining' from source: unknown 46400 1727204522.00935: variable 'ansible_timeout' from source: unknown 46400 1727204522.00939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.01042: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204522.01050: variable 'omit' from source: magic vars 46400 1727204522.01055: starting attempt loop 46400 1727204522.01058: running the handler 46400 1727204522.01100: variable '__network_connections_result' from source: set_fact 46400 1727204522.01155: variable '__network_connections_result' from source: set_fact 46400 1727204522.01236: handler run complete 46400 1727204522.01254: attempt loop complete, returning result 46400 1727204522.01257: _execute() done 46400 1727204522.01262: dumping result to json 46400 1727204522.01267: done dumping result, returning 46400 1727204522.01273: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-00000000021f] 46400 1727204522.01279: sending task result for task 0affcd87-79f5-1303-fda8-00000000021f 46400 1727204522.01370: done sending task result for task 0affcd87-79f5-1303-fda8-00000000021f 46400 1727204522.01373: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87" ] } } 46400 1727204522.01484: no more pending results, returning what we have 46400 
1727204522.01487: results queue empty 46400 1727204522.01488: checking for any_errors_fatal 46400 1727204522.01493: done checking for any_errors_fatal 46400 1727204522.01494: checking for max_fail_percentage 46400 1727204522.01495: done checking for max_fail_percentage 46400 1727204522.01496: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.01497: done checking to see if all hosts have failed 46400 1727204522.01497: getting the remaining hosts for this loop 46400 1727204522.01502: done getting the remaining hosts for this loop 46400 1727204522.01506: getting the next task for host managed-node2 46400 1727204522.01513: done getting next task for host managed-node2 46400 1727204522.01518: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204522.01521: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204522.01531: getting variables 46400 1727204522.01532: in VariableManager get_vars() 46400 1727204522.01573: Calling all_inventory to load vars for managed-node2 46400 1727204522.01575: Calling groups_inventory to load vars for managed-node2 46400 1727204522.01578: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.01586: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.01588: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.01590: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.02370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.03281: done with get_vars() 46400 1727204522.03299: done getting variables 46400 1727204522.03363: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.037) 0:00:12.318 ***** 46400 1727204522.03391: entering _queue_task() for managed-node2/debug 46400 1727204522.03631: worker is 1 (out of 1 available) 46400 1727204522.03646: exiting _queue_task() for managed-node2/debug 46400 1727204522.03662: done queuing things up, now waiting for results queue to drain 46400 1727204522.03665: waiting for pending results... 46400 1727204522.03839: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204522.03928: in run() - task 0affcd87-79f5-1303-fda8-000000000220 46400 1727204522.03939: variable 'ansible_search_path' from source: unknown 46400 1727204522.03943: variable 'ansible_search_path' from source: unknown 46400 1727204522.03974: calling self._execute() 46400 1727204522.04048: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.04053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.04065: variable 'omit' from source: magic vars 46400 1727204522.04333: variable 'ansible_distribution_major_version' from source: facts 46400 1727204522.04344: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204522.04433: variable 'network_state' from source: role '' defaults 46400 1727204522.04442: Evaluated conditional (network_state != {}): False 46400 1727204522.04445: when evaluation is False, skipping this task 46400 1727204522.04448: _execute() done 46400 1727204522.04450: dumping result to json 46400 1727204522.04453: done dumping result, returning 46400 1727204522.04459: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000000220] 46400 1727204522.04466: sending task result for task 0affcd87-79f5-1303-fda8-000000000220 46400 1727204522.04550: done sending task result for task 0affcd87-79f5-1303-fda8-000000000220 46400 1727204522.04553: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204522.04602: no more pending results, returning what we 
have 46400 1727204522.04606: results queue empty 46400 1727204522.04607: checking for any_errors_fatal 46400 1727204522.04618: done checking for any_errors_fatal 46400 1727204522.04619: checking for max_fail_percentage 46400 1727204522.04621: done checking for max_fail_percentage 46400 1727204522.04622: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.04622: done checking to see if all hosts have failed 46400 1727204522.04623: getting the remaining hosts for this loop 46400 1727204522.04625: done getting the remaining hosts for this loop 46400 1727204522.04628: getting the next task for host managed-node2 46400 1727204522.04637: done getting next task for host managed-node2 46400 1727204522.04641: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204522.04647: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204522.04668: getting variables 46400 1727204522.04670: in VariableManager get_vars() 46400 1727204522.04705: Calling all_inventory to load vars for managed-node2 46400 1727204522.04707: Calling groups_inventory to load vars for managed-node2 46400 1727204522.04709: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.04718: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.04720: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.04722: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.05613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.06515: done with get_vars() 46400 1727204522.06532: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.032) 0:00:12.350 ***** 46400 1727204522.06613: entering _queue_task() for managed-node2/ping 46400 1727204522.06615: Creating lock for ping 46400 1727204522.06855: worker is 1 (out of 1 available) 46400 1727204522.06873: exiting _queue_task() for managed-node2/ping 46400 1727204522.06887: done queuing things up, now waiting for results queue to drain 46400 1727204522.06888: waiting for pending results... 
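
The role finishes its configuration pass with a "Re-test connectivity" task, queued just above, which runs the ping module against managed-node2 through the same transfer-and-execute machinery already traced for network_connections. As an illustration of what such a connectivity check boils down to on the managed node, here is a minimal ping-like module body; this is a sketch of the idea, not the ansible.builtin.ping source.

```python
#!/usr/bin/env python3
# Minimal sketch of a ping-style connectivity check (not the ansible.builtin.ping source).
# It reads JSON arguments, answers with {"ping": "pong"} or echoes a supplied 'data'
# value, and reports changed=false, which is all a "re-test connectivity" step needs.
import json
import sys


def main() -> None:
    raw = sys.argv[1] if len(sys.argv) > 1 else "{}"
    args = json.loads(raw)
    data = args.get("data", "pong")
    if data == "crash":
        # a deliberate failure path, handy for exercising error handling in tests
        raise RuntimeError("boom")
    print(json.dumps({"changed": False, "ping": data}))


if __name__ == "__main__":
    main()
```

Run locally as a stand-in, `python3 ping_sketch.py '{"data": "pong"}'` prints `{"changed": false, "ping": "pong"}`; in the real run the entries that follow show the payload being shipped to the node and executed there just like the previous module.
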
46400 1727204522.07070: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204522.07158: in run() - task 0affcd87-79f5-1303-fda8-000000000221 46400 1727204522.07171: variable 'ansible_search_path' from source: unknown 46400 1727204522.07175: variable 'ansible_search_path' from source: unknown 46400 1727204522.07205: calling self._execute() 46400 1727204522.07280: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.07283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.07296: variable 'omit' from source: magic vars 46400 1727204522.07567: variable 'ansible_distribution_major_version' from source: facts 46400 1727204522.07576: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204522.07582: variable 'omit' from source: magic vars 46400 1727204522.07625: variable 'omit' from source: magic vars 46400 1727204522.07648: variable 'omit' from source: magic vars 46400 1727204522.07686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204522.07714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204522.07731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204522.07746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.07755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.07782: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204522.07785: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.07788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.07854: Set connection var ansible_shell_type to sh 46400 1727204522.07866: Set connection var ansible_shell_executable to /bin/sh 46400 1727204522.07869: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204522.07873: Set connection var ansible_connection to ssh 46400 1727204522.07881: Set connection var ansible_pipelining to False 46400 1727204522.07886: Set connection var ansible_timeout to 10 46400 1727204522.07904: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.07907: variable 'ansible_connection' from source: unknown 46400 1727204522.07910: variable 'ansible_module_compression' from source: unknown 46400 1727204522.07912: variable 'ansible_shell_type' from source: unknown 46400 1727204522.07914: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.07916: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.07920: variable 'ansible_pipelining' from source: unknown 46400 1727204522.07922: variable 'ansible_timeout' from source: unknown 46400 1727204522.07927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.08089: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204522.08099: variable 'omit' from source: magic vars 46400 
1727204522.08103: starting attempt loop 46400 1727204522.08106: running the handler 46400 1727204522.08118: _low_level_execute_command(): starting 46400 1727204522.08124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204522.08647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.08668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.08682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204522.08696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.08743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.08758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.08816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.10481: stdout chunk (state=3): >>>/root <<< 46400 1727204522.10585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204522.10642: stderr chunk (state=3): >>><<< 46400 1727204522.10649: stdout chunk (state=3): >>><<< 46400 1727204522.10678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204522.10692: _low_level_execute_command(): starting 46400 1727204522.10698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398 `" && echo ansible-tmp-1727204522.106798-47264-225628210781398="` 
echo /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398 `" ) && sleep 0' 46400 1727204522.11341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204522.11357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204522.11379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.11399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.11448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204522.11468: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204522.11493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.11512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204522.11529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204522.11541: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204522.11556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204522.11578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.11595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.11609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204522.11621: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204522.11642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.11723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.11750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204522.11773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.11883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.13744: stdout chunk (state=3): >>>ansible-tmp-1727204522.106798-47264-225628210781398=/root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398 <<< 46400 1727204522.13948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204522.13952: stdout chunk (state=3): >>><<< 46400 1727204522.13954: stderr chunk (state=3): >>><<< 46400 1727204522.14175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204522.106798-47264-225628210781398=/root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204522.14179: variable 'ansible_module_compression' from source: unknown 46400 1727204522.14182: ANSIBALLZ: Using lock for ping 46400 1727204522.14184: ANSIBALLZ: Acquiring lock 46400 1727204522.14186: ANSIBALLZ: Lock acquired: 140519368508960 46400 1727204522.14188: ANSIBALLZ: Creating module 46400 1727204522.22788: ANSIBALLZ: Writing module into payload 46400 1727204522.22832: ANSIBALLZ: Writing module 46400 1727204522.22854: ANSIBALLZ: Renaming module 46400 1727204522.22858: ANSIBALLZ: Done creating module 46400 1727204522.22875: variable 'ansible_facts' from source: unknown 46400 1727204522.22917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/AnsiballZ_ping.py 46400 1727204522.23031: Sending initial data 46400 1727204522.23035: Sent initial data (152 bytes) 46400 1727204522.23746: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204522.23752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.23790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.23803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.23852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.23866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.23924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.25808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204522.25841: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle 
limit 1019; using 64 <<< 46400 1727204522.25878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpr5sw6p77 /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/AnsiballZ_ping.py <<< 46400 1727204522.25911: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204522.26682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204522.26806: stderr chunk (state=3): >>><<< 46400 1727204522.26809: stdout chunk (state=3): >>><<< 46400 1727204522.26829: done transferring module to remote 46400 1727204522.26841: _low_level_execute_command(): starting 46400 1727204522.26844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/ /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/AnsiballZ_ping.py && sleep 0' 46400 1727204522.27305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204522.27310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.27355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.27359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.27371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.27426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.27429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204522.27435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.27476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.29258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204522.29315: stderr chunk (state=3): >>><<< 46400 1727204522.29318: stdout chunk (state=3): >>><<< 46400 1727204522.29333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204522.29336: _low_level_execute_command(): starting 46400 1727204522.29341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/AnsiballZ_ping.py && sleep 0' 46400 1727204522.29803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204522.29809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.29840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204522.29862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.29876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.29915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.29927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.29984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.43186: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204522.44261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204522.44272: stderr chunk (state=3): >>><<< 46400 1727204522.44275: stdout chunk (state=3): >>><<< 46400 1727204522.44293: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204522.44318: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204522.44328: _low_level_execute_command(): starting 46400 1727204522.44334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204522.106798-47264-225628210781398/ > /dev/null 2>&1 && sleep 0' 46400 1727204522.46215: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.46269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.46277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204522.46300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204522.46304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204522.46542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204522.46586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204522.46616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204522.46745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204522.48694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204522.48698: stdout chunk (state=3): >>><<< 46400 1727204522.48701: stderr chunk (state=3): >>><<< 46400 1727204522.48773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204522.48777: handler run complete 46400 1727204522.48779: attempt loop complete, returning result 46400 1727204522.48781: _execute() done 46400 1727204522.48783: dumping result to json 46400 1727204522.48785: done dumping result, returning 46400 1727204522.48787: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000000221] 46400 1727204522.48789: sending task result for task 0affcd87-79f5-1303-fda8-000000000221 46400 1727204522.49162: done sending task result for task 0affcd87-79f5-1303-fda8-000000000221 46400 1727204522.49167: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204522.49257: no more pending results, returning what we have 46400 1727204522.49261: results queue empty 46400 1727204522.49262: checking for any_errors_fatal 46400 1727204522.49270: done checking for any_errors_fatal 46400 1727204522.49271: checking for max_fail_percentage 46400 1727204522.49272: done checking for max_fail_percentage 46400 1727204522.49273: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.49274: done checking to see if all hosts have failed 46400 1727204522.49275: getting the remaining hosts for this loop 46400 1727204522.49276: done getting the remaining hosts for this loop 46400 1727204522.49280: getting the next task for host managed-node2 46400 1727204522.49290: done getting next task for host managed-node2 46400 1727204522.49292: ^ task is: TASK: meta (role_complete) 46400 1727204522.49298: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204522.49308: getting variables 46400 1727204522.49310: in VariableManager get_vars() 46400 1727204522.49353: Calling all_inventory to load vars for managed-node2 46400 1727204522.49356: Calling groups_inventory to load vars for managed-node2 46400 1727204522.49359: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.49371: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.49374: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.49377: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.51411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.55405: done with get_vars() 46400 1727204522.55433: done getting variables 46400 1727204522.55525: done queuing things up, now waiting for results queue to drain 46400 1727204522.55527: results queue empty 46400 1727204522.55528: checking for any_errors_fatal 46400 1727204522.55531: done checking for any_errors_fatal 46400 1727204522.55532: checking for max_fail_percentage 46400 1727204522.55533: done checking for max_fail_percentage 46400 1727204522.55534: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.55535: done checking to see if all hosts have failed 46400 1727204522.55535: getting the remaining hosts for this loop 46400 1727204522.55536: done getting the remaining hosts for this loop 46400 1727204522.55539: getting the next task for host managed-node2 46400 1727204522.55544: done getting next task for host managed-node2 46400 1727204522.55547: ^ task is: TASK: Show result 46400 1727204522.55549: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204522.55552: getting variables 46400 1727204522.55553: in VariableManager get_vars() 46400 1727204522.55567: Calling all_inventory to load vars for managed-node2 46400 1727204522.55570: Calling groups_inventory to load vars for managed-node2 46400 1727204522.55572: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.55578: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.55580: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.55583: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.58329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.62212: done with get_vars() 46400 1727204522.62243: done getting variables 46400 1727204522.62292: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.557) 0:00:12.907 ***** 46400 1727204522.62328: entering _queue_task() for managed-node2/debug 46400 1727204522.62644: worker is 1 (out of 1 available) 46400 1727204522.62659: exiting _queue_task() for managed-node2/debug 46400 1727204522.63552: done queuing things up, now waiting for results queue to drain 46400 1727204522.63557: waiting for pending results... 
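The debug action queued here resolves only the __network_connections_result fact before printing it (see the variable lookups and the ok: result in the records that follow), so the task at create_bridge_profile.yml:14 is presumably a one-line variable dump. A minimal sketch, assuming nothing beyond what the log shows:

- name: Show result
  debug:
    var: __network_connections_result   # fact registered earlier by the network role run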
46400 1727204522.63580: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204522.64117: in run() - task 0affcd87-79f5-1303-fda8-00000000018f 46400 1727204522.64141: variable 'ansible_search_path' from source: unknown 46400 1727204522.64150: variable 'ansible_search_path' from source: unknown 46400 1727204522.64201: calling self._execute() 46400 1727204522.64422: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.64435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.64451: variable 'omit' from source: magic vars 46400 1727204522.65296: variable 'ansible_distribution_major_version' from source: facts 46400 1727204522.65318: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204522.65330: variable 'omit' from source: magic vars 46400 1727204522.65439: variable 'omit' from source: magic vars 46400 1727204522.65640: variable 'omit' from source: magic vars 46400 1727204522.65690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204522.65847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204522.65879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204522.65902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.65918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204522.65954: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204522.66043: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.66053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.66274: Set connection var ansible_shell_type to sh 46400 1727204522.66290: Set connection var ansible_shell_executable to /bin/sh 46400 1727204522.66301: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204522.66309: Set connection var ansible_connection to ssh 46400 1727204522.66318: Set connection var ansible_pipelining to False 46400 1727204522.66327: Set connection var ansible_timeout to 10 46400 1727204522.66470: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.66481: variable 'ansible_connection' from source: unknown 46400 1727204522.66489: variable 'ansible_module_compression' from source: unknown 46400 1727204522.66495: variable 'ansible_shell_type' from source: unknown 46400 1727204522.66501: variable 'ansible_shell_executable' from source: unknown 46400 1727204522.66507: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.66514: variable 'ansible_pipelining' from source: unknown 46400 1727204522.66519: variable 'ansible_timeout' from source: unknown 46400 1727204522.66526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.66672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204522.66811: variable 'omit' from source: magic vars 46400 1727204522.66820: 
starting attempt loop 46400 1727204522.66826: running the handler 46400 1727204522.66876: variable '__network_connections_result' from source: set_fact 46400 1727204522.67104: variable '__network_connections_result' from source: set_fact 46400 1727204522.67470: handler run complete 46400 1727204522.67503: attempt loop complete, returning result 46400 1727204522.67510: _execute() done 46400 1727204522.67516: dumping result to json 46400 1727204522.67524: done dumping result, returning 46400 1727204522.67534: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-00000000018f] 46400 1727204522.67544: sending task result for task 0affcd87-79f5-1303-fda8-00000000018f ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 994c2922-44e9-4ac5-9912-c2f948bcac87" ] } } 46400 1727204522.67742: no more pending results, returning what we have 46400 1727204522.67746: results queue empty 46400 1727204522.67748: checking for any_errors_fatal 46400 1727204522.67749: done checking for any_errors_fatal 46400 1727204522.67750: checking for max_fail_percentage 46400 1727204522.67752: done checking for max_fail_percentage 46400 1727204522.67753: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.67754: done checking to see if all hosts have failed 46400 1727204522.67755: getting the remaining hosts for this loop 46400 1727204522.67757: done getting the remaining hosts for this loop 46400 1727204522.67761: getting the next task for host managed-node2 46400 1727204522.67773: done getting next task for host managed-node2 46400 1727204522.67776: ^ task is: TASK: Asserts 46400 1727204522.67778: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204522.67782: getting variables 46400 1727204522.67783: in VariableManager get_vars() 46400 1727204522.67817: Calling all_inventory to load vars for managed-node2 46400 1727204522.67820: Calling groups_inventory to load vars for managed-node2 46400 1727204522.67824: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.67837: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.67839: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.67842: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.69496: done sending task result for task 0affcd87-79f5-1303-fda8-00000000018f 46400 1727204522.69500: WORKER PROCESS EXITING 46400 1727204522.71232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.74454: done with get_vars() 46400 1727204522.74489: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.122) 0:00:13.030 ***** 46400 1727204522.74592: entering _queue_task() for managed-node2/include_tasks 46400 1727204522.75305: worker is 1 (out of 1 available) 46400 1727204522.75317: exiting _queue_task() for managed-node2/include_tasks 46400 1727204522.75331: done queuing things up, now waiting for results queue to drain 46400 1727204522.75333: waiting for pending results... 46400 1727204522.75951: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204522.76227: in run() - task 0affcd87-79f5-1303-fda8-000000000096 46400 1727204522.76251: variable 'ansible_search_path' from source: unknown 46400 1727204522.76259: variable 'ansible_search_path' from source: unknown 46400 1727204522.76314: variable 'lsr_assert' from source: include params 46400 1727204522.76813: variable 'lsr_assert' from source: include params 46400 1727204522.76896: variable 'omit' from source: magic vars 46400 1727204522.77272: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204522.77292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204522.77340: variable 'omit' from source: magic vars 46400 1727204522.77854: variable 'ansible_distribution_major_version' from source: facts 46400 1727204522.77994: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204522.78006: variable 'item' from source: unknown 46400 1727204522.78088: variable 'item' from source: unknown 46400 1727204522.78240: variable 'item' from source: unknown 46400 1727204522.78428: variable 'item' from source: unknown 46400 1727204522.78595: dumping result to json 46400 1727204522.78604: done dumping result, returning 46400 1727204522.78615: done running TaskExecutor() for managed-node2/TASK: Asserts [0affcd87-79f5-1303-fda8-000000000096] 46400 1727204522.78626: sending task result for task 0affcd87-79f5-1303-fda8-000000000096 46400 1727204522.78724: no more pending results, returning what we have 46400 1727204522.78730: in VariableManager get_vars() 46400 1727204522.78774: Calling all_inventory to load vars for managed-node2 46400 1727204522.78777: Calling groups_inventory to load vars for managed-node2 46400 1727204522.78782: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.78799: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204522.78802: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.78805: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.79380: done sending task result for task 0affcd87-79f5-1303-fda8-000000000096 46400 1727204522.79384: WORKER PROCESS EXITING 46400 1727204522.81410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.84902: done with get_vars() 46400 1727204522.84935: variable 'ansible_search_path' from source: unknown 46400 1727204522.84937: variable 'ansible_search_path' from source: unknown 46400 1727204522.84983: we have included files to process 46400 1727204522.84985: generating all_blocks data 46400 1727204522.84987: done generating all_blocks data 46400 1727204522.84993: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204522.84995: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204522.84997: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204522.85201: in VariableManager get_vars() 46400 1727204522.85221: done with get_vars() 46400 1727204522.85868: done processing included file 46400 1727204522.85870: iterating over new_blocks loaded from include file 46400 1727204522.85871: in VariableManager get_vars() 46400 1727204522.85885: done with get_vars() 46400 1727204522.85887: filtering new block on tags 46400 1727204522.85939: done filtering new block on tags 46400 1727204522.85941: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 46400 1727204522.85946: extending task lists for all hosts with included blocks 46400 1727204522.90308: done extending task lists 46400 1727204522.90311: done processing included files 46400 1727204522.90311: results queue empty 46400 1727204522.90312: checking for any_errors_fatal 46400 1727204522.90317: done checking for any_errors_fatal 46400 1727204522.90318: checking for max_fail_percentage 46400 1727204522.90320: done checking for max_fail_percentage 46400 1727204522.90320: checking to see if all hosts have failed and the running result is not ok 46400 1727204522.90321: done checking to see if all hosts have failed 46400 1727204522.90322: getting the remaining hosts for this loop 46400 1727204522.90324: done getting the remaining hosts for this loop 46400 1727204522.90326: getting the next task for host managed-node2 46400 1727204522.90331: done getting next task for host managed-node2 46400 1727204522.90334: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204522.90337: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204522.90340: getting variables 46400 1727204522.90341: in VariableManager get_vars() 46400 1727204522.90354: Calling all_inventory to load vars for managed-node2 46400 1727204522.90357: Calling groups_inventory to load vars for managed-node2 46400 1727204522.90359: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204522.90369: Calling all_plugins_play to load vars for managed-node2 46400 1727204522.90371: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204522.90375: Calling groups_plugins_play to load vars for managed-node2 46400 1727204522.93607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204522.98261: done with get_vars() 46400 1727204522.98298: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.239) 0:00:13.269 ***** 46400 1727204522.98503: entering _queue_task() for managed-node2/include_tasks 46400 1727204522.99297: worker is 1 (out of 1 available) 46400 1727204522.99310: exiting _queue_task() for managed-node2/include_tasks 46400 1727204522.99323: done queuing things up, now waiting for results queue to drain 46400 1727204522.99325: waiting for pending results... 
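The include queued here is an include_tasks that ends up loading tests/network/playbooks/tasks/get_profile_stat.yml (the file is processed a few records later). A minimal sketch; the bare relative filename passed at assert_profile_present.yml:3 is an assumption, only the resolved path is confirmed by the log:

- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml   # resolves to .../tests/network/playbooks/tasks/get_profile_stat.yml per the log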
46400 1727204523.00455: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204523.01201: in run() - task 0affcd87-79f5-1303-fda8-000000000383 46400 1727204523.01488: variable 'ansible_search_path' from source: unknown 46400 1727204523.01496: variable 'ansible_search_path' from source: unknown 46400 1727204523.01537: calling self._execute() 46400 1727204523.01634: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.01649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.01683: variable 'omit' from source: magic vars 46400 1727204523.02337: variable 'ansible_distribution_major_version' from source: facts 46400 1727204523.02587: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204523.02600: _execute() done 46400 1727204523.02607: dumping result to json 46400 1727204523.02614: done dumping result, returning 46400 1727204523.02623: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-000000000383] 46400 1727204523.02633: sending task result for task 0affcd87-79f5-1303-fda8-000000000383 46400 1727204523.02770: no more pending results, returning what we have 46400 1727204523.02776: in VariableManager get_vars() 46400 1727204523.02817: Calling all_inventory to load vars for managed-node2 46400 1727204523.02820: Calling groups_inventory to load vars for managed-node2 46400 1727204523.02823: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204523.02840: Calling all_plugins_play to load vars for managed-node2 46400 1727204523.02844: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204523.02848: Calling groups_plugins_play to load vars for managed-node2 46400 1727204523.03872: done sending task result for task 0affcd87-79f5-1303-fda8-000000000383 46400 1727204523.03876: WORKER PROCESS EXITING 46400 1727204523.06146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204523.10548: done with get_vars() 46400 1727204523.10888: variable 'ansible_search_path' from source: unknown 46400 1727204523.10890: variable 'ansible_search_path' from source: unknown 46400 1727204523.10901: variable 'item' from source: include params 46400 1727204523.11024: variable 'item' from source: include params 46400 1727204523.11061: we have included files to process 46400 1727204523.11062: generating all_blocks data 46400 1727204523.11065: done generating all_blocks data 46400 1727204523.11067: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204523.11068: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204523.11070: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204523.13449: done processing included file 46400 1727204523.13452: iterating over new_blocks loaded from include file 46400 1727204523.13453: in VariableManager get_vars() 46400 1727204523.13473: done with get_vars() 46400 1727204523.13475: filtering new block on tags 46400 1727204523.13922: done filtering new block on tags 46400 1727204523.13926: in VariableManager get_vars() 46400 1727204523.13946: done with get_vars() 46400 
1727204523.13948: filtering new block on tags 46400 1727204523.14092: done filtering new block on tags 46400 1727204523.14094: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204523.14100: extending task lists for all hosts with included blocks 46400 1727204523.15716: done extending task lists 46400 1727204523.15718: done processing included files 46400 1727204523.15719: results queue empty 46400 1727204523.15720: checking for any_errors_fatal 46400 1727204523.15723: done checking for any_errors_fatal 46400 1727204523.15724: checking for max_fail_percentage 46400 1727204523.15725: done checking for max_fail_percentage 46400 1727204523.15726: checking to see if all hosts have failed and the running result is not ok 46400 1727204523.15727: done checking to see if all hosts have failed 46400 1727204523.15728: getting the remaining hosts for this loop 46400 1727204523.15729: done getting the remaining hosts for this loop 46400 1727204523.15732: getting the next task for host managed-node2 46400 1727204523.15737: done getting next task for host managed-node2 46400 1727204523.15739: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204523.15743: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204523.15745: getting variables 46400 1727204523.15746: in VariableManager get_vars() 46400 1727204523.15759: Calling all_inventory to load vars for managed-node2 46400 1727204523.15762: Calling groups_inventory to load vars for managed-node2 46400 1727204523.15883: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204523.15892: Calling all_plugins_play to load vars for managed-node2 46400 1727204523.15894: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204523.15897: Calling groups_plugins_play to load vars for managed-node2 46400 1727204523.18668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204523.21549: done with get_vars() 46400 1727204523.21582: done getting variables 46400 1727204523.21631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:02:03 -0400 (0:00:00.231) 0:00:13.501 ***** 46400 1727204523.21669: entering _queue_task() for managed-node2/set_fact 46400 1727204523.22518: worker is 1 (out of 1 available) 46400 1727204523.22529: exiting _queue_task() for managed-node2/set_fact 46400 1727204523.22543: done queuing things up, now waiting for results queue to drain 46400 1727204523.22544: waiting for pending results... 
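The set_fact handler for this task returns exactly three facts, all false (see the ok: result further down), so the task at get_profile_stat.yml:3 presumably just initializes those flags before the checks that follow. A minimal sketch reconstructed from that result:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false            # presumably updated by the later checks (e.g. "Stat profile file")
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false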
46400 1727204523.23358: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204523.24506: in run() - task 0affcd87-79f5-1303-fda8-0000000003fe 46400 1727204523.24530: variable 'ansible_search_path' from source: unknown 46400 1727204523.24537: variable 'ansible_search_path' from source: unknown 46400 1727204523.24584: calling self._execute() 46400 1727204523.24686: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.24699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.24714: variable 'omit' from source: magic vars 46400 1727204523.25086: variable 'ansible_distribution_major_version' from source: facts 46400 1727204523.25725: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204523.25731: variable 'omit' from source: magic vars 46400 1727204523.25817: variable 'omit' from source: magic vars 46400 1727204523.25875: variable 'omit' from source: magic vars 46400 1727204523.25918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204523.25953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204523.25988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204523.26007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204523.26018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204523.26050: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204523.26053: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.26056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.26157: Set connection var ansible_shell_type to sh 46400 1727204523.26172: Set connection var ansible_shell_executable to /bin/sh 46400 1727204523.26178: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204523.26190: Set connection var ansible_connection to ssh 46400 1727204523.26195: Set connection var ansible_pipelining to False 46400 1727204523.26201: Set connection var ansible_timeout to 10 46400 1727204523.26227: variable 'ansible_shell_executable' from source: unknown 46400 1727204523.26231: variable 'ansible_connection' from source: unknown 46400 1727204523.26234: variable 'ansible_module_compression' from source: unknown 46400 1727204523.26236: variable 'ansible_shell_type' from source: unknown 46400 1727204523.26239: variable 'ansible_shell_executable' from source: unknown 46400 1727204523.26241: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.26243: variable 'ansible_pipelining' from source: unknown 46400 1727204523.26245: variable 'ansible_timeout' from source: unknown 46400 1727204523.26251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.26410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204523.26420: variable 
'omit' from source: magic vars 46400 1727204523.26426: starting attempt loop 46400 1727204523.26429: running the handler 46400 1727204523.26447: handler run complete 46400 1727204523.26457: attempt loop complete, returning result 46400 1727204523.26462: _execute() done 46400 1727204523.26467: dumping result to json 46400 1727204523.26470: done dumping result, returning 46400 1727204523.26472: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-0000000003fe] 46400 1727204523.26479: sending task result for task 0affcd87-79f5-1303-fda8-0000000003fe 46400 1727204523.26574: done sending task result for task 0affcd87-79f5-1303-fda8-0000000003fe 46400 1727204523.26577: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204523.26629: no more pending results, returning what we have 46400 1727204523.26633: results queue empty 46400 1727204523.26634: checking for any_errors_fatal 46400 1727204523.26637: done checking for any_errors_fatal 46400 1727204523.26637: checking for max_fail_percentage 46400 1727204523.26639: done checking for max_fail_percentage 46400 1727204523.26640: checking to see if all hosts have failed and the running result is not ok 46400 1727204523.26641: done checking to see if all hosts have failed 46400 1727204523.26641: getting the remaining hosts for this loop 46400 1727204523.26643: done getting the remaining hosts for this loop 46400 1727204523.26647: getting the next task for host managed-node2 46400 1727204523.26657: done getting next task for host managed-node2 46400 1727204523.26663: ^ task is: TASK: Stat profile file 46400 1727204523.26670: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204523.26675: getting variables 46400 1727204523.26676: in VariableManager get_vars() 46400 1727204523.26709: Calling all_inventory to load vars for managed-node2 46400 1727204523.26711: Calling groups_inventory to load vars for managed-node2 46400 1727204523.26715: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204523.26726: Calling all_plugins_play to load vars for managed-node2 46400 1727204523.26728: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204523.26731: Calling groups_plugins_play to load vars for managed-node2 46400 1727204523.38570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204523.41758: done with get_vars() 46400 1727204523.41791: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:02:03 -0400 (0:00:00.202) 0:00:13.703 ***** 46400 1727204523.41879: entering _queue_task() for managed-node2/stat 46400 1727204523.42802: worker is 1 (out of 1 available) 46400 1727204523.42814: exiting _queue_task() for managed-node2/stat 46400 1727204523.42826: done queuing things up, now waiting for results queue to drain 46400 1727204523.42828: waiting for pending results... 46400 1727204523.43611: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204523.43765: in run() - task 0affcd87-79f5-1303-fda8-0000000003ff 46400 1727204523.43791: variable 'ansible_search_path' from source: unknown 46400 1727204523.43804: variable 'ansible_search_path' from source: unknown 46400 1727204523.43845: calling self._execute() 46400 1727204523.43950: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.43967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.43981: variable 'omit' from source: magic vars 46400 1727204523.44379: variable 'ansible_distribution_major_version' from source: facts 46400 1727204523.44397: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204523.44408: variable 'omit' from source: magic vars 46400 1727204523.44474: variable 'omit' from source: magic vars 46400 1727204523.44582: variable 'profile' from source: play vars 46400 1727204523.44593: variable 'interface' from source: play vars 46400 1727204523.44668: variable 'interface' from source: play vars 46400 1727204523.44700: variable 'omit' from source: magic vars 46400 1727204523.44746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204523.44796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204523.44822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204523.44844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204523.44863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204523.44903: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204523.44912: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.44921: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.45179: Set connection var ansible_shell_type to sh 46400 1727204523.45195: Set connection var ansible_shell_executable to /bin/sh 46400 1727204523.45206: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204523.45218: Set connection var ansible_connection to ssh 46400 1727204523.45227: Set connection var ansible_pipelining to False 46400 1727204523.45241: Set connection var ansible_timeout to 10 46400 1727204523.45275: variable 'ansible_shell_executable' from source: unknown 46400 1727204523.45284: variable 'ansible_connection' from source: unknown 46400 1727204523.45290: variable 'ansible_module_compression' from source: unknown 46400 1727204523.45297: variable 'ansible_shell_type' from source: unknown 46400 1727204523.45303: variable 'ansible_shell_executable' from source: unknown 46400 1727204523.45309: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.45316: variable 'ansible_pipelining' from source: unknown 46400 1727204523.45324: variable 'ansible_timeout' from source: unknown 46400 1727204523.45331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.45544: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204523.45684: variable 'omit' from source: magic vars 46400 1727204523.45696: starting attempt loop 46400 1727204523.45704: running the handler 46400 1727204523.45723: _low_level_execute_command(): starting 46400 1727204523.45735: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204523.47636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204523.47658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.47678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.47697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.47739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.47757: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204523.47777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.47795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204523.47807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204523.47817: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204523.47829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.47843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.47862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.47882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.47894: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204523.47908: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.47995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.48017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.48035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.48121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.49788: stdout chunk (state=3): >>>/root <<< 46400 1727204523.49986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204523.49990: stdout chunk (state=3): >>><<< 46400 1727204523.49992: stderr chunk (state=3): >>><<< 46400 1727204523.50074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204523.50078: _low_level_execute_command(): starting 46400 1727204523.50080: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034 `" && echo ansible-tmp-1727204523.5001621-47393-4314678103034="` echo /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034 `" ) && sleep 0' 46400 1727204523.51709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204523.51720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.51729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.51746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.51789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.51795: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204523.51805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.51819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204523.51826: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204523.51835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204523.51840: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.51849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.51863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.51872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.51880: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204523.51889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.51972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.51981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.51989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.52730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.54627: stdout chunk (state=3): >>>ansible-tmp-1727204523.5001621-47393-4314678103034=/root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034 <<< 46400 1727204523.54797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204523.54801: stdout chunk (state=3): >>><<< 46400 1727204523.54807: stderr chunk (state=3): >>><<< 46400 1727204523.54828: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204523.5001621-47393-4314678103034=/root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204523.54881: variable 'ansible_module_compression' from source: unknown 46400 1727204523.54945: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204523.54983: variable 'ansible_facts' from source: unknown 46400 1727204523.55071: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/AnsiballZ_stat.py 46400 1727204523.55798: Sending initial data 46400 1727204523.55801: Sent initial data (151 bytes) 46400 1727204523.57926: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.57930: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.57973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.57978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204523.57994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204523.58000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.58017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204523.58021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.58168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.58212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.58221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.58292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.60033: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204523.60069: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204523.60109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp71n6x9d4 /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/AnsiballZ_stat.py <<< 46400 1727204523.60146: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204523.61572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204523.61652: stderr chunk (state=3): >>><<< 46400 1727204523.61656: stdout chunk (state=3): >>><<< 46400 1727204523.61688: done transferring module to remote 46400 1727204523.61698: _low_level_execute_command(): starting 46400 1727204523.61702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/ /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/AnsiballZ_stat.py && sleep 0' 46400 1727204523.64008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204523.64016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.64026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204523.64041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.64090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.64097: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204523.64107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.64120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204523.64181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204523.64191: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204523.64200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.64213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.64224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.64233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.64239: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204523.64248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.64389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.64405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.64416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.64511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.66369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204523.66374: stdout chunk (state=3): >>><<< 46400 1727204523.66378: stderr chunk (state=3): >>><<< 46400 1727204523.66401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204523.66405: _low_level_execute_command(): starting 46400 1727204523.66410: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/AnsiballZ_stat.py && sleep 0' 
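For context, the two tasks traced above ("Initialize NM profile exist and ansible_managed comment flag" and "Stat profile file") come from get_profile_stat.yml in the fedora.linux_system_roles network tests. The tasks below are a hedged reconstruction, not the role's verbatim source: the fact names come from the set_fact result shown earlier, the stat arguments from the module_args echoed further down, and the use of the play vars profile/interface and the register name profile_stat are assumptions inferred from the later "profile_stat.stat.exists" conditional.

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  stat:
    # path matches the module_args reported in the result below;
    # deriving it from the 'profile' play var is an assumption
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # name inferred from the later conditional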
46400 1727204523.68227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204523.68283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.68294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.68307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.68466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.68470: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204523.68481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.68495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204523.68502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204523.68509: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204523.68516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.68525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.68536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.68543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.68550: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204523.68574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.68645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.68789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.68800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.68950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.82318: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204523.83386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204523.83487: stderr chunk (state=3): >>><<< 46400 1727204523.83491: stdout chunk (state=3): >>><<< 46400 1727204523.83510: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204523.83539: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204523.83549: _low_level_execute_command(): starting 46400 1727204523.83554: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204523.5001621-47393-4314678103034/ > /dev/null 2>&1 && sleep 0' 46400 1727204523.84768: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204523.84988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.85087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.85105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.85150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.85169: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204523.85185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.85202: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 46400 1727204523.85213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204523.85223: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204523.85234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204523.85248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204523.85268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204523.85286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204523.85298: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204523.85311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204523.85585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204523.85610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204523.85627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204523.85700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204523.87592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204523.87636: stderr chunk (state=3): >>><<< 46400 1727204523.87640: stdout chunk (state=3): >>><<< 46400 1727204523.87770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204523.87774: handler run complete 46400 1727204523.87776: attempt loop complete, returning result 46400 1727204523.87778: _execute() done 46400 1727204523.87781: dumping result to json 46400 1727204523.87783: done dumping result, returning 46400 1727204523.87785: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-0000000003ff] 46400 1727204523.87787: sending task result for task 0affcd87-79f5-1303-fda8-0000000003ff 46400 1727204523.87859: done sending task result for task 0affcd87-79f5-1303-fda8-0000000003ff 46400 1727204523.87867: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204523.87927: no more pending results, returning what we have 46400 
1727204523.87931: results queue empty 46400 1727204523.87932: checking for any_errors_fatal 46400 1727204523.87941: done checking for any_errors_fatal 46400 1727204523.87941: checking for max_fail_percentage 46400 1727204523.87943: done checking for max_fail_percentage 46400 1727204523.87944: checking to see if all hosts have failed and the running result is not ok 46400 1727204523.87945: done checking to see if all hosts have failed 46400 1727204523.87945: getting the remaining hosts for this loop 46400 1727204523.87948: done getting the remaining hosts for this loop 46400 1727204523.87952: getting the next task for host managed-node2 46400 1727204523.87966: done getting next task for host managed-node2 46400 1727204523.87970: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204523.87976: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204523.87980: getting variables 46400 1727204523.87982: in VariableManager get_vars() 46400 1727204523.88018: Calling all_inventory to load vars for managed-node2 46400 1727204523.88021: Calling groups_inventory to load vars for managed-node2 46400 1727204523.88025: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204523.88037: Calling all_plugins_play to load vars for managed-node2 46400 1727204523.88041: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204523.88044: Calling groups_plugins_play to load vars for managed-node2 46400 1727204523.90902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204523.94319: done with get_vars() 46400 1727204523.94353: done getting variables 46400 1727204523.94423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:02:03 -0400 (0:00:00.525) 0:00:14.229 ***** 46400 1727204523.94462: entering _queue_task() for managed-node2/set_fact 46400 1727204523.95419: worker is 1 (out of 1 available) 46400 1727204523.95433: exiting _queue_task() for managed-node2/set_fact 46400 1727204523.95447: done queuing things up, now waiting for results queue to drain 46400 1727204523.95448: waiting for pending results... 
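The task announced above re-uses the stat result from the previous step; because the ifcfg file does not exist, the when-condition fails and the task is skipped in the result that follows. A minimal sketch of such a task, inferred from the skip reason in the log rather than copied from the role:

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists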
46400 1727204523.96375: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204523.96768: in run() - task 0affcd87-79f5-1303-fda8-000000000400 46400 1727204523.96779: variable 'ansible_search_path' from source: unknown 46400 1727204523.96782: variable 'ansible_search_path' from source: unknown 46400 1727204523.96946: calling self._execute() 46400 1727204523.97171: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204523.97178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204523.97188: variable 'omit' from source: magic vars 46400 1727204523.98035: variable 'ansible_distribution_major_version' from source: facts 46400 1727204523.98049: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204523.98292: variable 'profile_stat' from source: set_fact 46400 1727204523.98302: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204523.98305: when evaluation is False, skipping this task 46400 1727204523.98308: _execute() done 46400 1727204523.98311: dumping result to json 46400 1727204523.98313: done dumping result, returning 46400 1727204523.98319: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-000000000400] 46400 1727204523.98326: sending task result for task 0affcd87-79f5-1303-fda8-000000000400 46400 1727204523.98542: done sending task result for task 0affcd87-79f5-1303-fda8-000000000400 46400 1727204523.98544: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204523.98600: no more pending results, returning what we have 46400 1727204523.98606: results queue empty 46400 1727204523.98608: checking for any_errors_fatal 46400 1727204523.98634: done checking for any_errors_fatal 46400 1727204523.98635: checking for max_fail_percentage 46400 1727204523.98638: done checking for max_fail_percentage 46400 1727204523.98639: checking to see if all hosts have failed and the running result is not ok 46400 1727204523.98640: done checking to see if all hosts have failed 46400 1727204523.98641: getting the remaining hosts for this loop 46400 1727204523.98642: done getting the remaining hosts for this loop 46400 1727204523.98647: getting the next task for host managed-node2 46400 1727204523.98657: done getting next task for host managed-node2 46400 1727204523.98665: ^ task is: TASK: Get NM profile info 46400 1727204523.98672: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204523.98677: getting variables 46400 1727204523.98679: in VariableManager get_vars() 46400 1727204523.98711: Calling all_inventory to load vars for managed-node2 46400 1727204523.98713: Calling groups_inventory to load vars for managed-node2 46400 1727204523.98717: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204523.98729: Calling all_plugins_play to load vars for managed-node2 46400 1727204523.98732: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204523.98734: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.01477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.04840: done with get_vars() 46400 1727204524.04874: done getting variables 46400 1727204524.05179: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.107) 0:00:14.336 ***** 46400 1727204524.05215: entering _queue_task() for managed-node2/shell 46400 1727204524.05217: Creating lock for shell 46400 1727204524.05971: worker is 1 (out of 1 available) 46400 1727204524.05984: exiting _queue_task() for managed-node2/shell 46400 1727204524.05996: done queuing things up, now waiting for results queue to drain 46400 1727204524.05997: waiting for pending results... 
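The shell command being prepared here is not visible in this excerpt of the log, so the following is only a plausible sketch: a shell task that asks NetworkManager (via nmcli) whether a connection profile matching the interface exists, registering the output for the flag-setting tasks that come later. Both the exact command text and the nm_profile_exists register name are assumptions.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"
  register: nm_profile_exists  # assumed register name
  ignore_errors: true          # grep exits non-zero when no profile matches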
46400 1727204524.06709: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204524.07104: in run() - task 0affcd87-79f5-1303-fda8-000000000401 46400 1727204524.07124: variable 'ansible_search_path' from source: unknown 46400 1727204524.07131: variable 'ansible_search_path' from source: unknown 46400 1727204524.07289: calling self._execute() 46400 1727204524.07496: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.07512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.07526: variable 'omit' from source: magic vars 46400 1727204524.08242: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.08378: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.08392: variable 'omit' from source: magic vars 46400 1727204524.08462: variable 'omit' from source: magic vars 46400 1727204524.08713: variable 'profile' from source: play vars 46400 1727204524.08723: variable 'interface' from source: play vars 46400 1727204524.08910: variable 'interface' from source: play vars 46400 1727204524.08937: variable 'omit' from source: magic vars 46400 1727204524.08987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204524.09070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204524.09248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204524.09273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.09288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.09319: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204524.09327: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.09341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.09542: Set connection var ansible_shell_type to sh 46400 1727204524.09670: Set connection var ansible_shell_executable to /bin/sh 46400 1727204524.09685: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204524.09695: Set connection var ansible_connection to ssh 46400 1727204524.09704: Set connection var ansible_pipelining to False 46400 1727204524.09713: Set connection var ansible_timeout to 10 46400 1727204524.09741: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.09748: variable 'ansible_connection' from source: unknown 46400 1727204524.09754: variable 'ansible_module_compression' from source: unknown 46400 1727204524.09760: variable 'ansible_shell_type' from source: unknown 46400 1727204524.09774: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.09782: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.09792: variable 'ansible_pipelining' from source: unknown 46400 1727204524.09799: variable 'ansible_timeout' from source: unknown 46400 1727204524.09805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.10145: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204524.10162: variable 'omit' from source: magic vars 46400 1727204524.10175: starting attempt loop 46400 1727204524.10181: running the handler 46400 1727204524.10193: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204524.10244: _low_level_execute_command(): starting 46400 1727204524.10330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204524.12262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.12275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.12413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204524.12417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204524.12421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204524.12423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.12470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204524.12504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204524.12508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.12643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.14289: stdout chunk (state=3): >>>/root <<< 46400 1727204524.14397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204524.14487: stderr chunk (state=3): >>><<< 46400 1727204524.14490: stdout chunk (state=3): >>><<< 46400 1727204524.14617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204524.14621: _low_level_execute_command(): starting 46400 1727204524.14625: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271 `" && echo ansible-tmp-1727204524.1451695-47410-149347954030271="` echo /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271 `" ) && sleep 0' 46400 1727204524.15946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.15950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.15979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204524.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.16121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.16124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.16222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204524.16332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204524.16336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.16393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.18286: stdout chunk (state=3): >>>ansible-tmp-1727204524.1451695-47410-149347954030271=/root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271 <<< 46400 1727204524.18406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204524.18482: stderr chunk (state=3): >>><<< 46400 1727204524.18486: stdout chunk (state=3): >>><<< 46400 1727204524.18776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204524.1451695-47410-149347954030271=/root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204524.18779: variable 'ansible_module_compression' from source: unknown 46400 1727204524.18782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204524.18784: variable 'ansible_facts' from source: unknown 46400 1727204524.18786: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/AnsiballZ_command.py 46400 1727204524.19327: Sending initial data 46400 1727204524.19331: Sent initial data (156 bytes) 46400 1727204524.21891: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204524.21949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.22058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.22082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.22124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.22135: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204524.22152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.22180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204524.22192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204524.22202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204524.22213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.22225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.22238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.22249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.22263: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204524.22285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.22361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204524.22500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204524.22519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.22708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.24462: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204524.24497: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204524.24537: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpllgvh4vo /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/AnsiballZ_command.py <<< 46400 1727204524.24574: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204524.25914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204524.26069: stderr chunk (state=3): >>><<< 46400 1727204524.26073: stdout chunk (state=3): >>><<< 46400 1727204524.26075: done transferring module to remote 46400 1727204524.26078: _low_level_execute_command(): starting 46400 1727204524.26080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/ /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/AnsiballZ_command.py && sleep 0' 46400 1727204524.27028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204524.27043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.27057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.27092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.27134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.27146: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204524.27160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.27188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204524.27201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204524.27213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204524.27224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.27236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.27252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.27263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.27275: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204524.27298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.27376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204524.27393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 
1727204524.27417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.27502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.29395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204524.29399: stdout chunk (state=3): >>><<< 46400 1727204524.29401: stderr chunk (state=3): >>><<< 46400 1727204524.29499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204524.29503: _low_level_execute_command(): starting 46400 1727204524.29506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/AnsiballZ_command.py && sleep 0' 46400 1727204524.30496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204524.30519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.30533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.30549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.30591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.30602: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204524.30626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.30643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204524.30653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204524.30663: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204524.30676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.30688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204524.30702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.30712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204524.30729: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204524.30746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.30870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204524.30893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204524.30910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.30993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.46270: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:04.442871", "end": "2024-09-24 15:02:04.461664", "delta": "0:00:00.018793", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204524.47566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204524.47572: stdout chunk (state=3): >>><<< 46400 1727204524.47574: stderr chunk (state=3): >>><<< 46400 1727204524.47739: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:04.442871", "end": "2024-09-24 15:02:04.461664", "delta": "0:00:00.018793", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
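For reference, a minimal sketch of the kind of task that produces the module invocation captured above. The shell command and the nm_profile_exists register name come straight from this log (the command appears in module_args, and a later entry reads the variable nm_profile_exists and evaluates nm_profile_exists.rc == 0); the layout, the use of {{ profile }}, and ignore_errors are assumptions, not the verbatim contents of get_profile_stat.yml:

    # Sketch of the "Get NM profile info" step, reconstructed from the logged module_args.
    - name: Get NM profile info
      shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"  # '{{ profile }}' is 'statebr' in this run (templating assumed)
      register: nm_profile_exists  # read later via nm_profile_exists.rc == 0
      ignore_errors: true  # assumption: a missing profile should not abort the play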
46400 1727204524.47749: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204524.47752: _low_level_execute_command(): starting 46400 1727204524.47754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204524.1451695-47410-149347954030271/ > /dev/null 2>&1 && sleep 0' 46400 1727204524.48308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204524.48312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204524.48331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204524.48335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204524.48420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204524.48436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204524.48509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204524.50402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204524.50408: stdout chunk (state=3): >>><<< 46400 1727204524.50411: stderr chunk (state=3): >>><<< 46400 1727204524.50572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204524.50576: handler run complete 46400 1727204524.50579: Evaluated conditional (False): False 46400 1727204524.50581: attempt loop complete, returning result 46400 1727204524.50585: _execute() done 46400 1727204524.50587: dumping result to json 46400 1727204524.50589: done dumping result, returning 46400 1727204524.50594: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-000000000401] 46400 1727204524.50596: sending task result for task 0affcd87-79f5-1303-fda8-000000000401 46400 1727204524.50752: done sending task result for task 0affcd87-79f5-1303-fda8-000000000401 46400 1727204524.50756: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018793", "end": "2024-09-24 15:02:04.461664", "rc": 0, "start": "2024-09-24 15:02:04.442871" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 46400 1727204524.50857: no more pending results, returning what we have 46400 1727204524.50862: results queue empty 46400 1727204524.50863: checking for any_errors_fatal 46400 1727204524.50873: done checking for any_errors_fatal 46400 1727204524.50874: checking for max_fail_percentage 46400 1727204524.50879: done checking for max_fail_percentage 46400 1727204524.50881: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.50882: done checking to see if all hosts have failed 46400 1727204524.50883: getting the remaining hosts for this loop 46400 1727204524.50885: done getting the remaining hosts for this loop 46400 1727204524.50892: getting the next task for host managed-node2 46400 1727204524.50906: done getting next task for host managed-node2 46400 1727204524.50909: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204524.50914: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204524.50918: getting variables 46400 1727204524.50920: in VariableManager get_vars() 46400 1727204524.50956: Calling all_inventory to load vars for managed-node2 46400 1727204524.50959: Calling groups_inventory to load vars for managed-node2 46400 1727204524.50965: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.50986: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.50989: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.50993: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.52982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.55024: done with get_vars() 46400 1727204524.55054: done getting variables 46400 1727204524.55128: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.499) 0:00:14.836 ***** 46400 1727204524.55173: entering _queue_task() for managed-node2/set_fact 46400 1727204524.55522: worker is 1 (out of 1 available) 46400 1727204524.55534: exiting _queue_task() for managed-node2/set_fact 46400 1727204524.55551: done queuing things up, now waiting for results queue to drain 46400 1727204524.55553: waiting for pending results... 
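At this point the worker evaluates nm_profile_exists.rc == 0 and sets three facts, as the result further down shows. A minimal set_fact sketch consistent with that output; the fact names and the when: condition are taken from this log, while the surrounding task structure is assumed:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0  # evaluated to True in the entries below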
46400 1727204524.55858: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204524.55997: in run() - task 0affcd87-79f5-1303-fda8-000000000402 46400 1727204524.56019: variable 'ansible_search_path' from source: unknown 46400 1727204524.56026: variable 'ansible_search_path' from source: unknown 46400 1727204524.56073: calling self._execute() 46400 1727204524.56183: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.56200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.56217: variable 'omit' from source: magic vars 46400 1727204524.56652: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.56683: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.56874: variable 'nm_profile_exists' from source: set_fact 46400 1727204524.56897: Evaluated conditional (nm_profile_exists.rc == 0): True 46400 1727204524.56909: variable 'omit' from source: magic vars 46400 1727204524.57001: variable 'omit' from source: magic vars 46400 1727204524.57043: variable 'omit' from source: magic vars 46400 1727204524.57117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204524.57173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204524.57210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204524.57234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.57254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.57313: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204524.57324: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.57339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.57469: Set connection var ansible_shell_type to sh 46400 1727204524.57486: Set connection var ansible_shell_executable to /bin/sh 46400 1727204524.57496: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204524.57511: Set connection var ansible_connection to ssh 46400 1727204524.57524: Set connection var ansible_pipelining to False 46400 1727204524.57535: Set connection var ansible_timeout to 10 46400 1727204524.57582: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.57593: variable 'ansible_connection' from source: unknown 46400 1727204524.57602: variable 'ansible_module_compression' from source: unknown 46400 1727204524.57621: variable 'ansible_shell_type' from source: unknown 46400 1727204524.57634: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.57643: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.57659: variable 'ansible_pipelining' from source: unknown 46400 1727204524.57674: variable 'ansible_timeout' from source: unknown 46400 1727204524.57688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.57903: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204524.57928: variable 'omit' from source: magic vars 46400 1727204524.57941: starting attempt loop 46400 1727204524.57958: running the handler 46400 1727204524.57982: handler run complete 46400 1727204524.57997: attempt loop complete, returning result 46400 1727204524.58007: _execute() done 46400 1727204524.58020: dumping result to json 46400 1727204524.58032: done dumping result, returning 46400 1727204524.58052: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-000000000402] 46400 1727204524.58075: sending task result for task 0affcd87-79f5-1303-fda8-000000000402 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 46400 1727204524.58274: no more pending results, returning what we have 46400 1727204524.58278: results queue empty 46400 1727204524.58279: checking for any_errors_fatal 46400 1727204524.58291: done checking for any_errors_fatal 46400 1727204524.58292: checking for max_fail_percentage 46400 1727204524.58294: done checking for max_fail_percentage 46400 1727204524.58296: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.58297: done checking to see if all hosts have failed 46400 1727204524.58298: getting the remaining hosts for this loop 46400 1727204524.58299: done getting the remaining hosts for this loop 46400 1727204524.58304: getting the next task for host managed-node2 46400 1727204524.58320: done getting next task for host managed-node2 46400 1727204524.58324: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204524.58329: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204524.58333: getting variables 46400 1727204524.58335: in VariableManager get_vars() 46400 1727204524.58376: Calling all_inventory to load vars for managed-node2 46400 1727204524.58382: Calling groups_inventory to load vars for managed-node2 46400 1727204524.58387: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.58401: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.58405: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.58408: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.59406: done sending task result for task 0affcd87-79f5-1303-fda8-000000000402 46400 1727204524.59409: WORKER PROCESS EXITING 46400 1727204524.60208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.62083: done with get_vars() 46400 1727204524.62110: done getting variables 46400 1727204524.62181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.62323: variable 'profile' from source: play vars 46400 1727204524.62328: variable 'interface' from source: play vars 46400 1727204524.62404: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.072) 0:00:14.909 ***** 46400 1727204524.62448: entering _queue_task() for managed-node2/command 46400 1727204524.62854: worker is 1 (out of 1 available) 46400 1727204524.62868: exiting _queue_task() for managed-node2/command 46400 1727204524.62881: done queuing things up, now waiting for results queue to drain 46400 1727204524.62883: waiting for pending results... 
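The next tasks in get_profile_stat.yml (lines 49, 56, 62 and 69) are all guarded by the same condition, and each is skipped below because profile_stat.stat.exists is false, presumably because no ifcfg-statebr file exists on the managed node (the nmcli output above already located this profile at /etc/NetworkManager/system-connections/statebr.nmconnection). A minimal sketch of such a guarded task; only the when: condition is taken from this log, and the grep command and register name are hypothetical placeholders:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: 'grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}'  # hypothetical command
      register: ifcfg_ansible_managed  # hypothetical register name
      when: profile_stat.stat.exists  # false in this run, so the task is skipped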
46400 1727204524.63204: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204524.63372: in run() - task 0affcd87-79f5-1303-fda8-000000000404 46400 1727204524.63399: variable 'ansible_search_path' from source: unknown 46400 1727204524.63408: variable 'ansible_search_path' from source: unknown 46400 1727204524.63468: calling self._execute() 46400 1727204524.63580: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.63593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.63610: variable 'omit' from source: magic vars 46400 1727204524.64073: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.64101: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.64259: variable 'profile_stat' from source: set_fact 46400 1727204524.64283: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204524.64293: when evaluation is False, skipping this task 46400 1727204524.64302: _execute() done 46400 1727204524.64311: dumping result to json 46400 1727204524.64329: done dumping result, returning 46400 1727204524.64343: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000404] 46400 1727204524.64354: sending task result for task 0affcd87-79f5-1303-fda8-000000000404 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204524.64520: no more pending results, returning what we have 46400 1727204524.64525: results queue empty 46400 1727204524.64526: checking for any_errors_fatal 46400 1727204524.64534: done checking for any_errors_fatal 46400 1727204524.64535: checking for max_fail_percentage 46400 1727204524.64537: done checking for max_fail_percentage 46400 1727204524.64538: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.64539: done checking to see if all hosts have failed 46400 1727204524.64539: getting the remaining hosts for this loop 46400 1727204524.64541: done getting the remaining hosts for this loop 46400 1727204524.64545: getting the next task for host managed-node2 46400 1727204524.64557: done getting next task for host managed-node2 46400 1727204524.64562: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204524.64574: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204524.64579: getting variables 46400 1727204524.64581: in VariableManager get_vars() 46400 1727204524.64619: Calling all_inventory to load vars for managed-node2 46400 1727204524.64622: Calling groups_inventory to load vars for managed-node2 46400 1727204524.64627: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.64646: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.64650: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.64654: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.65628: done sending task result for task 0affcd87-79f5-1303-fda8-000000000404 46400 1727204524.65632: WORKER PROCESS EXITING 46400 1727204524.67086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.69150: done with get_vars() 46400 1727204524.69194: done getting variables 46400 1727204524.69322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.69527: variable 'profile' from source: play vars 46400 1727204524.69531: variable 'interface' from source: play vars 46400 1727204524.69605: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.071) 0:00:14.980 ***** 46400 1727204524.69643: entering _queue_task() for managed-node2/set_fact 46400 1727204524.70088: worker is 1 (out of 1 available) 46400 1727204524.70100: exiting _queue_task() for managed-node2/set_fact 46400 1727204524.70112: done queuing things up, now waiting for results queue to drain 46400 1727204524.70114: waiting for pending results... 
46400 1727204524.70491: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204524.70657: in run() - task 0affcd87-79f5-1303-fda8-000000000405 46400 1727204524.70683: variable 'ansible_search_path' from source: unknown 46400 1727204524.70695: variable 'ansible_search_path' from source: unknown 46400 1727204524.70740: calling self._execute() 46400 1727204524.70847: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.70860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.70876: variable 'omit' from source: magic vars 46400 1727204524.71289: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.71307: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.71488: variable 'profile_stat' from source: set_fact 46400 1727204524.71504: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204524.71511: when evaluation is False, skipping this task 46400 1727204524.71517: _execute() done 46400 1727204524.71524: dumping result to json 46400 1727204524.71534: done dumping result, returning 46400 1727204524.71559: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000405] 46400 1727204524.71583: sending task result for task 0affcd87-79f5-1303-fda8-000000000405 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204524.71755: no more pending results, returning what we have 46400 1727204524.71760: results queue empty 46400 1727204524.71761: checking for any_errors_fatal 46400 1727204524.71769: done checking for any_errors_fatal 46400 1727204524.71770: checking for max_fail_percentage 46400 1727204524.71772: done checking for max_fail_percentage 46400 1727204524.71773: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.71774: done checking to see if all hosts have failed 46400 1727204524.71775: getting the remaining hosts for this loop 46400 1727204524.71777: done getting the remaining hosts for this loop 46400 1727204524.71781: getting the next task for host managed-node2 46400 1727204524.71791: done getting next task for host managed-node2 46400 1727204524.71794: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204524.71800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204524.71803: getting variables 46400 1727204524.71805: in VariableManager get_vars() 46400 1727204524.71842: Calling all_inventory to load vars for managed-node2 46400 1727204524.71847: Calling groups_inventory to load vars for managed-node2 46400 1727204524.71851: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.71867: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.71870: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.71874: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.72934: done sending task result for task 0affcd87-79f5-1303-fda8-000000000405 46400 1727204524.72939: WORKER PROCESS EXITING 46400 1727204524.74247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.76485: done with get_vars() 46400 1727204524.76519: done getting variables 46400 1727204524.76595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.76739: variable 'profile' from source: play vars 46400 1727204524.76743: variable 'interface' from source: play vars 46400 1727204524.76841: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.072) 0:00:15.053 ***** 46400 1727204524.76875: entering _queue_task() for managed-node2/command 46400 1727204524.77228: worker is 1 (out of 1 available) 46400 1727204524.77242: exiting _queue_task() for managed-node2/command 46400 1727204524.77254: done queuing things up, now waiting for results queue to drain 46400 1727204524.77256: waiting for pending results... 
46400 1727204524.77555: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204524.77695: in run() - task 0affcd87-79f5-1303-fda8-000000000406 46400 1727204524.77720: variable 'ansible_search_path' from source: unknown 46400 1727204524.77728: variable 'ansible_search_path' from source: unknown 46400 1727204524.77774: calling self._execute() 46400 1727204524.77880: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.77891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.77904: variable 'omit' from source: magic vars 46400 1727204524.78285: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.78305: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.78434: variable 'profile_stat' from source: set_fact 46400 1727204524.78489: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204524.78496: when evaluation is False, skipping this task 46400 1727204524.78502: _execute() done 46400 1727204524.78509: dumping result to json 46400 1727204524.78519: done dumping result, returning 46400 1727204524.78529: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000406] 46400 1727204524.78539: sending task result for task 0affcd87-79f5-1303-fda8-000000000406 46400 1727204524.78653: done sending task result for task 0affcd87-79f5-1303-fda8-000000000406 46400 1727204524.78661: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204524.78723: no more pending results, returning what we have 46400 1727204524.78728: results queue empty 46400 1727204524.78730: checking for any_errors_fatal 46400 1727204524.78737: done checking for any_errors_fatal 46400 1727204524.78738: checking for max_fail_percentage 46400 1727204524.78740: done checking for max_fail_percentage 46400 1727204524.78741: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.78742: done checking to see if all hosts have failed 46400 1727204524.78743: getting the remaining hosts for this loop 46400 1727204524.78744: done getting the remaining hosts for this loop 46400 1727204524.78750: getting the next task for host managed-node2 46400 1727204524.78760: done getting next task for host managed-node2 46400 1727204524.78763: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204524.78770: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204524.78776: getting variables 46400 1727204524.78777: in VariableManager get_vars() 46400 1727204524.78811: Calling all_inventory to load vars for managed-node2 46400 1727204524.78814: Calling groups_inventory to load vars for managed-node2 46400 1727204524.78818: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.78833: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.78836: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.78840: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.80793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.81971: done with get_vars() 46400 1727204524.81995: done getting variables 46400 1727204524.82059: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.82202: variable 'profile' from source: play vars 46400 1727204524.82207: variable 'interface' from source: play vars 46400 1727204524.82286: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.054) 0:00:15.107 ***** 46400 1727204524.82320: entering _queue_task() for managed-node2/set_fact 46400 1727204524.82671: worker is 1 (out of 1 available) 46400 1727204524.82684: exiting _queue_task() for managed-node2/set_fact 46400 1727204524.82697: done queuing things up, now waiting for results queue to drain 46400 1727204524.82699: waiting for pending results... 
46400 1727204524.83328: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204524.83469: in run() - task 0affcd87-79f5-1303-fda8-000000000407 46400 1727204524.83488: variable 'ansible_search_path' from source: unknown 46400 1727204524.83495: variable 'ansible_search_path' from source: unknown 46400 1727204524.83552: calling self._execute() 46400 1727204524.83699: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.83711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.83724: variable 'omit' from source: magic vars 46400 1727204524.84133: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.84150: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.84282: variable 'profile_stat' from source: set_fact 46400 1727204524.84304: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204524.84312: when evaluation is False, skipping this task 46400 1727204524.84317: _execute() done 46400 1727204524.84323: dumping result to json 46400 1727204524.84329: done dumping result, returning 46400 1727204524.84339: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000407] 46400 1727204524.84351: sending task result for task 0affcd87-79f5-1303-fda8-000000000407 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204524.84513: no more pending results, returning what we have 46400 1727204524.84518: results queue empty 46400 1727204524.84520: checking for any_errors_fatal 46400 1727204524.84530: done checking for any_errors_fatal 46400 1727204524.84530: checking for max_fail_percentage 46400 1727204524.84532: done checking for max_fail_percentage 46400 1727204524.84534: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.84535: done checking to see if all hosts have failed 46400 1727204524.84535: getting the remaining hosts for this loop 46400 1727204524.84537: done getting the remaining hosts for this loop 46400 1727204524.84542: getting the next task for host managed-node2 46400 1727204524.84554: done getting next task for host managed-node2 46400 1727204524.84557: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 46400 1727204524.84562: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204524.84571: getting variables 46400 1727204524.84573: in VariableManager get_vars() 46400 1727204524.84612: Calling all_inventory to load vars for managed-node2 46400 1727204524.84615: Calling groups_inventory to load vars for managed-node2 46400 1727204524.84620: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.84635: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.84639: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.84642: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.86084: done sending task result for task 0affcd87-79f5-1303-fda8-000000000407 46400 1727204524.86088: WORKER PROCESS EXITING 46400 1727204524.87357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.89199: done with get_vars() 46400 1727204524.89221: done getting variables 46400 1727204524.89289: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.89405: variable 'profile' from source: play vars 46400 1727204524.89409: variable 'interface' from source: play vars 46400 1727204524.89463: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.071) 0:00:15.179 ***** 46400 1727204524.89495: entering _queue_task() for managed-node2/assert 46400 1727204524.89830: worker is 1 (out of 1 available) 46400 1727204524.89845: exiting _queue_task() for managed-node2/assert 46400 1727204524.89859: done queuing things up, now waiting for results queue to drain 46400 1727204524.89861: waiting for pending results... 
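The run has now reached assert_profile_present.yml, whose assertions map onto the three facts set earlier. The first two conditionals, lsr_net_profile_exists and lsr_net_profile_ansible_managed, are visible in the entries below; the third is assumed by analogy, since this section ends while that task is still queued. A minimal sketch:

    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists

    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint  # assumed; only queued within this part of the log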
46400 1727204524.90213: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 46400 1727204524.90368: in run() - task 0affcd87-79f5-1303-fda8-000000000384 46400 1727204524.90389: variable 'ansible_search_path' from source: unknown 46400 1727204524.90397: variable 'ansible_search_path' from source: unknown 46400 1727204524.90445: calling self._execute() 46400 1727204524.90551: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.90563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.90580: variable 'omit' from source: magic vars 46400 1727204524.91017: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.91041: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.91053: variable 'omit' from source: magic vars 46400 1727204524.91124: variable 'omit' from source: magic vars 46400 1727204524.91297: variable 'profile' from source: play vars 46400 1727204524.91307: variable 'interface' from source: play vars 46400 1727204524.91375: variable 'interface' from source: play vars 46400 1727204524.91404: variable 'omit' from source: magic vars 46400 1727204524.91451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204524.91500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204524.91528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204524.91589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.91609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.91643: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204524.91653: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.91661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.91767: Set connection var ansible_shell_type to sh 46400 1727204524.91784: Set connection var ansible_shell_executable to /bin/sh 46400 1727204524.91795: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204524.91804: Set connection var ansible_connection to ssh 46400 1727204524.91813: Set connection var ansible_pipelining to False 46400 1727204524.91826: Set connection var ansible_timeout to 10 46400 1727204524.91855: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.91865: variable 'ansible_connection' from source: unknown 46400 1727204524.91874: variable 'ansible_module_compression' from source: unknown 46400 1727204524.91880: variable 'ansible_shell_type' from source: unknown 46400 1727204524.91887: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.91894: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.91901: variable 'ansible_pipelining' from source: unknown 46400 1727204524.91915: variable 'ansible_timeout' from source: unknown 46400 1727204524.91927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.92521: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204524.92548: variable 'omit' from source: magic vars 46400 1727204524.92572: starting attempt loop 46400 1727204524.92593: running the handler 46400 1727204524.92770: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204524.92795: Evaluated conditional (lsr_net_profile_exists): True 46400 1727204524.92809: handler run complete 46400 1727204524.92835: attempt loop complete, returning result 46400 1727204524.92848: _execute() done 46400 1727204524.92857: dumping result to json 46400 1727204524.92870: done dumping result, returning 46400 1727204524.92906: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [0affcd87-79f5-1303-fda8-000000000384] 46400 1727204524.92918: sending task result for task 0affcd87-79f5-1303-fda8-000000000384 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204524.93085: no more pending results, returning what we have 46400 1727204524.93090: results queue empty 46400 1727204524.93091: checking for any_errors_fatal 46400 1727204524.93099: done checking for any_errors_fatal 46400 1727204524.93100: checking for max_fail_percentage 46400 1727204524.93102: done checking for max_fail_percentage 46400 1727204524.93103: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.93104: done checking to see if all hosts have failed 46400 1727204524.93105: getting the remaining hosts for this loop 46400 1727204524.93106: done getting the remaining hosts for this loop 46400 1727204524.93111: getting the next task for host managed-node2 46400 1727204524.93119: done getting next task for host managed-node2 46400 1727204524.93122: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 46400 1727204524.93126: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204524.93130: getting variables 46400 1727204524.93132: in VariableManager get_vars() 46400 1727204524.93168: Calling all_inventory to load vars for managed-node2 46400 1727204524.93172: Calling groups_inventory to load vars for managed-node2 46400 1727204524.93177: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.93190: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.93193: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.93196: Calling groups_plugins_play to load vars for managed-node2 46400 1727204524.94184: done sending task result for task 0affcd87-79f5-1303-fda8-000000000384 46400 1727204524.94187: WORKER PROCESS EXITING 46400 1727204524.94799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204524.96348: done with get_vars() 46400 1727204524.96377: done getting variables 46400 1727204524.96432: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204524.96557: variable 'profile' from source: play vars 46400 1727204524.96561: variable 'interface' from source: play vars 46400 1727204524.96628: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:02:04 -0400 (0:00:00.071) 0:00:15.251 ***** 46400 1727204524.96675: entering _queue_task() for managed-node2/assert 46400 1727204524.97041: worker is 1 (out of 1 available) 46400 1727204524.97054: exiting _queue_task() for managed-node2/assert 46400 1727204524.97069: done queuing things up, now waiting for results queue to drain 46400 1727204524.97071: waiting for pending results... 
46400 1727204524.97414: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 46400 1727204524.97430: in run() - task 0affcd87-79f5-1303-fda8-000000000385 46400 1727204524.97446: variable 'ansible_search_path' from source: unknown 46400 1727204524.97449: variable 'ansible_search_path' from source: unknown 46400 1727204524.97511: calling self._execute() 46400 1727204524.97874: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.97878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.97881: variable 'omit' from source: magic vars 46400 1727204524.98652: variable 'ansible_distribution_major_version' from source: facts 46400 1727204524.98667: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204524.98673: variable 'omit' from source: magic vars 46400 1727204524.98735: variable 'omit' from source: magic vars 46400 1727204524.98886: variable 'profile' from source: play vars 46400 1727204524.98890: variable 'interface' from source: play vars 46400 1727204524.98971: variable 'interface' from source: play vars 46400 1727204524.98983: variable 'omit' from source: magic vars 46400 1727204524.99029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204524.99070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204524.99093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204524.99110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.99121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204524.99150: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204524.99154: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.99156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.99255: Set connection var ansible_shell_type to sh 46400 1727204524.99267: Set connection var ansible_shell_executable to /bin/sh 46400 1727204524.99282: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204524.99289: Set connection var ansible_connection to ssh 46400 1727204524.99292: Set connection var ansible_pipelining to False 46400 1727204524.99301: Set connection var ansible_timeout to 10 46400 1727204524.99323: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.99326: variable 'ansible_connection' from source: unknown 46400 1727204524.99329: variable 'ansible_module_compression' from source: unknown 46400 1727204524.99331: variable 'ansible_shell_type' from source: unknown 46400 1727204524.99333: variable 'ansible_shell_executable' from source: unknown 46400 1727204524.99335: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204524.99340: variable 'ansible_pipelining' from source: unknown 46400 1727204524.99342: variable 'ansible_timeout' from source: unknown 46400 1727204524.99346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204524.99491: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204524.99505: variable 'omit' from source: magic vars 46400 1727204524.99512: starting attempt loop 46400 1727204524.99515: running the handler 46400 1727204524.99634: variable 'lsr_net_profile_ansible_managed' from source: set_fact 46400 1727204524.99638: Evaluated conditional (lsr_net_profile_ansible_managed): True 46400 1727204524.99641: handler run complete 46400 1727204524.99656: attempt loop complete, returning result 46400 1727204524.99658: _execute() done 46400 1727204524.99665: dumping result to json 46400 1727204524.99669: done dumping result, returning 46400 1727204524.99673: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcd87-79f5-1303-fda8-000000000385] 46400 1727204524.99676: sending task result for task 0affcd87-79f5-1303-fda8-000000000385 46400 1727204524.99770: done sending task result for task 0affcd87-79f5-1303-fda8-000000000385 46400 1727204524.99773: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204524.99821: no more pending results, returning what we have 46400 1727204524.99825: results queue empty 46400 1727204524.99826: checking for any_errors_fatal 46400 1727204524.99834: done checking for any_errors_fatal 46400 1727204524.99834: checking for max_fail_percentage 46400 1727204524.99836: done checking for max_fail_percentage 46400 1727204524.99837: checking to see if all hosts have failed and the running result is not ok 46400 1727204524.99838: done checking to see if all hosts have failed 46400 1727204524.99839: getting the remaining hosts for this loop 46400 1727204524.99840: done getting the remaining hosts for this loop 46400 1727204524.99844: getting the next task for host managed-node2 46400 1727204524.99852: done getting next task for host managed-node2 46400 1727204524.99854: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 46400 1727204524.99858: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204524.99867: getting variables 46400 1727204524.99869: in VariableManager get_vars() 46400 1727204524.99905: Calling all_inventory to load vars for managed-node2 46400 1727204524.99907: Calling groups_inventory to load vars for managed-node2 46400 1727204524.99911: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204524.99922: Calling all_plugins_play to load vars for managed-node2 46400 1727204524.99924: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204524.99927: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.01744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.02809: done with get_vars() 46400 1727204525.02837: done getting variables 46400 1727204525.02903: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204525.03034: variable 'profile' from source: play vars 46400 1727204525.03048: variable 'interface' from source: play vars 46400 1727204525.03123: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.064) 0:00:15.316 ***** 46400 1727204525.03163: entering _queue_task() for managed-node2/assert 46400 1727204525.03519: worker is 1 (out of 1 available) 46400 1727204525.03535: exiting _queue_task() for managed-node2/assert 46400 1727204525.03548: done queuing things up, now waiting for results queue to drain 46400 1727204525.03550: waiting for pending results... 
46400 1727204525.03874: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 46400 1727204525.04108: in run() - task 0affcd87-79f5-1303-fda8-000000000386 46400 1727204525.04138: variable 'ansible_search_path' from source: unknown 46400 1727204525.04146: variable 'ansible_search_path' from source: unknown 46400 1727204525.04195: calling self._execute() 46400 1727204525.04298: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.04310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.04327: variable 'omit' from source: magic vars 46400 1727204525.04748: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.04757: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.04766: variable 'omit' from source: magic vars 46400 1727204525.04816: variable 'omit' from source: magic vars 46400 1727204525.04926: variable 'profile' from source: play vars 46400 1727204525.04942: variable 'interface' from source: play vars 46400 1727204525.05325: variable 'interface' from source: play vars 46400 1727204525.05450: variable 'omit' from source: magic vars 46400 1727204525.05728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204525.05802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204525.05843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204525.05877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.05895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.05944: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204525.05953: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.05966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.06073: Set connection var ansible_shell_type to sh 46400 1727204525.06092: Set connection var ansible_shell_executable to /bin/sh 46400 1727204525.06104: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204525.06113: Set connection var ansible_connection to ssh 46400 1727204525.06121: Set connection var ansible_pipelining to False 46400 1727204525.06130: Set connection var ansible_timeout to 10 46400 1727204525.06158: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.06171: variable 'ansible_connection' from source: unknown 46400 1727204525.06177: variable 'ansible_module_compression' from source: unknown 46400 1727204525.06183: variable 'ansible_shell_type' from source: unknown 46400 1727204525.06188: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.06194: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.06200: variable 'ansible_pipelining' from source: unknown 46400 1727204525.06206: variable 'ansible_timeout' from source: unknown 46400 1727204525.06214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.06391: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204525.06420: variable 'omit' from source: magic vars 46400 1727204525.06438: starting attempt loop 46400 1727204525.06447: running the handler 46400 1727204525.06609: variable 'lsr_net_profile_fingerprint' from source: set_fact 46400 1727204525.06624: Evaluated conditional (lsr_net_profile_fingerprint): True 46400 1727204525.06642: handler run complete 46400 1727204525.06679: attempt loop complete, returning result 46400 1727204525.06687: _execute() done 46400 1727204525.06695: dumping result to json 46400 1727204525.06701: done dumping result, returning 46400 1727204525.06719: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [0affcd87-79f5-1303-fda8-000000000386] 46400 1727204525.06742: sending task result for task 0affcd87-79f5-1303-fda8-000000000386 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204525.07002: no more pending results, returning what we have 46400 1727204525.07006: results queue empty 46400 1727204525.07007: checking for any_errors_fatal 46400 1727204525.07013: done checking for any_errors_fatal 46400 1727204525.07014: checking for max_fail_percentage 46400 1727204525.07016: done checking for max_fail_percentage 46400 1727204525.07017: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.07017: done checking to see if all hosts have failed 46400 1727204525.07018: getting the remaining hosts for this loop 46400 1727204525.07020: done getting the remaining hosts for this loop 46400 1727204525.07024: getting the next task for host managed-node2 46400 1727204525.07035: done getting next task for host managed-node2 46400 1727204525.07038: ^ task is: TASK: Conditional asserts 46400 1727204525.07040: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204525.07046: getting variables 46400 1727204525.07047: in VariableManager get_vars() 46400 1727204525.07083: Calling all_inventory to load vars for managed-node2 46400 1727204525.07086: Calling groups_inventory to load vars for managed-node2 46400 1727204525.07089: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.07102: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.07104: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.07108: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.07667: done sending task result for task 0affcd87-79f5-1303-fda8-000000000386 46400 1727204525.07671: WORKER PROCESS EXITING 46400 1727204525.09130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.11121: done with get_vars() 46400 1727204525.11180: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.081) 0:00:15.397 ***** 46400 1727204525.11287: entering _queue_task() for managed-node2/include_tasks 46400 1727204525.11535: worker is 1 (out of 1 available) 46400 1727204525.11549: exiting _queue_task() for managed-node2/include_tasks 46400 1727204525.11565: done queuing things up, now waiting for results queue to drain 46400 1727204525.11567: waiting for pending results... 46400 1727204525.11748: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204525.11821: in run() - task 0affcd87-79f5-1303-fda8-000000000097 46400 1727204525.11831: variable 'ansible_search_path' from source: unknown 46400 1727204525.11836: variable 'ansible_search_path' from source: unknown 46400 1727204525.12052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204525.14243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204525.14310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204525.14346: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204525.14382: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204525.14476: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204525.14773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204525.14778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204525.14780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204525.14782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204525.14784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204525.14787: variable 'lsr_assert_when' from source: include params 46400 1727204525.14803: variable 'network_provider' from source: set_fact 46400 1727204525.14876: variable 'omit' from source: magic vars 46400 1727204525.14995: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.15004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.15014: variable 'omit' from source: magic vars 46400 1727204525.15220: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.15229: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.15341: variable 'item' from source: unknown 46400 1727204525.15346: Evaluated conditional (item['condition']): True 46400 1727204525.15421: variable 'item' from source: unknown 46400 1727204525.15453: variable 'item' from source: unknown 46400 1727204525.15515: variable 'item' from source: unknown 46400 1727204525.15655: dumping result to json 46400 1727204525.15657: done dumping result, returning 46400 1727204525.15659: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-000000000097] 46400 1727204525.15661: sending task result for task 0affcd87-79f5-1303-fda8-000000000097 46400 1727204525.15700: done sending task result for task 0affcd87-79f5-1303-fda8-000000000097 46400 1727204525.15702: WORKER PROCESS EXITING 46400 1727204525.15728: no more pending results, returning what we have 46400 1727204525.15733: in VariableManager get_vars() 46400 1727204525.15772: Calling all_inventory to load vars for managed-node2 46400 1727204525.15775: Calling groups_inventory to load vars for managed-node2 46400 1727204525.15778: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.15789: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.15792: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.15795: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.17358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.19033: done with get_vars() 46400 1727204525.19055: variable 'ansible_search_path' from source: unknown 46400 1727204525.19056: variable 'ansible_search_path' from source: unknown 46400 1727204525.19097: we have included files to process 46400 1727204525.19099: generating all_blocks data 46400 1727204525.19100: done generating all_blocks data 46400 1727204525.19105: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204525.19107: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204525.19108: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204525.19286: in VariableManager get_vars() 46400 1727204525.19308: done with get_vars() 46400 1727204525.19428: done processing included file 
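The 'Conditional asserts' task (run_test.yml:42) shown above is an include driven by the lsr_assert_when variable; the log records it evaluating item['condition'] and then including tasks/assert_device_present.yml for the item {'what': 'tasks/assert_device_present.yml', 'condition': True}. A hedged sketch of how such a conditional include loop could be written; only the item shape, the variable name, and the included file path come from the log, the rest is assumed.

    # Hypothetical sketch of the "Conditional asserts" task at run_test.yml:42;
    # the loop over lsr_assert_when and the item keys match the log, the rest is assumed.
    - name: Conditional asserts
      include_tasks: "{{ item.what }}"
      when: item.condition
      loop: "{{ lsr_assert_when }}"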
46400 1727204525.19431: iterating over new_blocks loaded from include file 46400 1727204525.19432: in VariableManager get_vars() 46400 1727204525.19452: done with get_vars() 46400 1727204525.19454: filtering new block on tags 46400 1727204525.19492: done filtering new block on tags 46400 1727204525.19495: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item={'what': 'tasks/assert_device_present.yml', 'condition': True}) 46400 1727204525.19500: extending task lists for all hosts with included blocks 46400 1727204525.20933: done extending task lists 46400 1727204525.20935: done processing included files 46400 1727204525.20936: results queue empty 46400 1727204525.20936: checking for any_errors_fatal 46400 1727204525.20941: done checking for any_errors_fatal 46400 1727204525.20941: checking for max_fail_percentage 46400 1727204525.20942: done checking for max_fail_percentage 46400 1727204525.20943: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.20944: done checking to see if all hosts have failed 46400 1727204525.20945: getting the remaining hosts for this loop 46400 1727204525.20946: done getting the remaining hosts for this loop 46400 1727204525.20949: getting the next task for host managed-node2 46400 1727204525.20953: done getting next task for host managed-node2 46400 1727204525.20955: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204525.20958: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204525.20969: getting variables 46400 1727204525.20970: in VariableManager get_vars() 46400 1727204525.20984: Calling all_inventory to load vars for managed-node2 46400 1727204525.20991: Calling groups_inventory to load vars for managed-node2 46400 1727204525.20993: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.21001: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.21004: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.21007: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.23190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.25218: done with get_vars() 46400 1727204525.25259: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.141) 0:00:15.538 ***** 46400 1727204525.25402: entering _queue_task() for managed-node2/include_tasks 46400 1727204525.25766: worker is 1 (out of 1 available) 46400 1727204525.25779: exiting _queue_task() for managed-node2/include_tasks 46400 1727204525.25794: done queuing things up, now waiting for results queue to drain 46400 1727204525.25796: waiting for pending results... 46400 1727204525.26658: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204525.26776: in run() - task 0affcd87-79f5-1303-fda8-000000000452 46400 1727204525.26786: variable 'ansible_search_path' from source: unknown 46400 1727204525.26789: variable 'ansible_search_path' from source: unknown 46400 1727204525.26831: calling self._execute() 46400 1727204525.26937: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.26941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.26953: variable 'omit' from source: magic vars 46400 1727204525.27326: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.27338: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.27348: _execute() done 46400 1727204525.27352: dumping result to json 46400 1727204525.27354: done dumping result, returning 46400 1727204525.27359: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-000000000452] 46400 1727204525.27371: sending task result for task 0affcd87-79f5-1303-fda8-000000000452 46400 1727204525.27466: done sending task result for task 0affcd87-79f5-1303-fda8-000000000452 46400 1727204525.27469: WORKER PROCESS EXITING 46400 1727204525.27495: no more pending results, returning what we have 46400 1727204525.27500: in VariableManager get_vars() 46400 1727204525.27539: Calling all_inventory to load vars for managed-node2 46400 1727204525.27541: Calling groups_inventory to load vars for managed-node2 46400 1727204525.27545: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.27559: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.27562: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.27567: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.30226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204525.32616: done with get_vars() 46400 1727204525.32642: variable 'ansible_search_path' from source: unknown 46400 1727204525.32643: variable 'ansible_search_path' from source: unknown 46400 1727204525.32801: variable 'item' from source: include params 46400 1727204525.32840: we have included files to process 46400 1727204525.32842: generating all_blocks data 46400 1727204525.32844: done generating all_blocks data 46400 1727204525.32846: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204525.32847: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204525.32848: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204525.33055: done processing included file 46400 1727204525.33057: iterating over new_blocks loaded from include file 46400 1727204525.33058: in VariableManager get_vars() 46400 1727204525.33081: done with get_vars() 46400 1727204525.33083: filtering new block on tags 46400 1727204525.33115: done filtering new block on tags 46400 1727204525.33118: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204525.33126: extending task lists for all hosts with included blocks 46400 1727204525.33628: done extending task lists 46400 1727204525.33630: done processing included files 46400 1727204525.33631: results queue empty 46400 1727204525.33631: checking for any_errors_fatal 46400 1727204525.33636: done checking for any_errors_fatal 46400 1727204525.33636: checking for max_fail_percentage 46400 1727204525.33637: done checking for max_fail_percentage 46400 1727204525.33639: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.33640: done checking to see if all hosts have failed 46400 1727204525.33641: getting the remaining hosts for this loop 46400 1727204525.33642: done getting the remaining hosts for this loop 46400 1727204525.33645: getting the next task for host managed-node2 46400 1727204525.33650: done getting next task for host managed-node2 46400 1727204525.33652: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204525.33655: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204525.33658: getting variables 46400 1727204525.33659: in VariableManager get_vars() 46400 1727204525.33672: Calling all_inventory to load vars for managed-node2 46400 1727204525.33675: Calling groups_inventory to load vars for managed-node2 46400 1727204525.33677: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.33683: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.33685: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.33688: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.35163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.36973: done with get_vars() 46400 1727204525.37008: done getting variables 46400 1727204525.37155: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.117) 0:00:15.656 ***** 46400 1727204525.37229: entering _queue_task() for managed-node2/stat 46400 1727204525.38144: worker is 1 (out of 1 available) 46400 1727204525.38157: exiting _queue_task() for managed-node2/stat 46400 1727204525.38176: done queuing things up, now waiting for results queue to drain 46400 1727204525.38178: waiting for pending results... 46400 1727204525.38526: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204525.38662: in run() - task 0affcd87-79f5-1303-fda8-0000000004e8 46400 1727204525.38684: variable 'ansible_search_path' from source: unknown 46400 1727204525.38688: variable 'ansible_search_path' from source: unknown 46400 1727204525.38720: calling self._execute() 46400 1727204525.38822: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.38828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.38837: variable 'omit' from source: magic vars 46400 1727204525.39230: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.39243: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.39250: variable 'omit' from source: magic vars 46400 1727204525.39318: variable 'omit' from source: magic vars 46400 1727204525.39427: variable 'interface' from source: play vars 46400 1727204525.39443: variable 'omit' from source: magic vars 46400 1727204525.39492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204525.39533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204525.39557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204525.39584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.39594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.39629: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204525.39633: variable 'ansible_host' from 
source: host vars for 'managed-node2' 46400 1727204525.39636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.39740: Set connection var ansible_shell_type to sh 46400 1727204525.39750: Set connection var ansible_shell_executable to /bin/sh 46400 1727204525.39756: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204525.39763: Set connection var ansible_connection to ssh 46400 1727204525.39773: Set connection var ansible_pipelining to False 46400 1727204525.39778: Set connection var ansible_timeout to 10 46400 1727204525.39805: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.39808: variable 'ansible_connection' from source: unknown 46400 1727204525.39811: variable 'ansible_module_compression' from source: unknown 46400 1727204525.39813: variable 'ansible_shell_type' from source: unknown 46400 1727204525.39816: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.39818: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.39820: variable 'ansible_pipelining' from source: unknown 46400 1727204525.39823: variable 'ansible_timeout' from source: unknown 46400 1727204525.39828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.40162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204525.40400: variable 'omit' from source: magic vars 46400 1727204525.40407: starting attempt loop 46400 1727204525.40410: running the handler 46400 1727204525.40428: _low_level_execute_command(): starting 46400 1727204525.40435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204525.41776: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204525.41794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.41809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.41830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.41879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.41890: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204525.41903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.41920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204525.41931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204525.41947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204525.41965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.41980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.41995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.42007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.42017: stderr chunk (state=3): >>>debug2: match 
found <<< 46400 1727204525.42030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.42107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.42124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.42140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.42229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.43884: stdout chunk (state=3): >>>/root <<< 46400 1727204525.44102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204525.44106: stdout chunk (state=3): >>><<< 46400 1727204525.44108: stderr chunk (state=3): >>><<< 46400 1727204525.44242: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204525.44246: _low_level_execute_command(): starting 46400 1727204525.44250: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875 `" && echo ansible-tmp-1727204525.4412944-47467-111992264805875="` echo /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875 `" ) && sleep 0' 46400 1727204525.45799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.45803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.45840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204525.45845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.45848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.45920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.45923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.46039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.46141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.47993: stdout chunk (state=3): >>>ansible-tmp-1727204525.4412944-47467-111992264805875=/root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875 <<< 46400 1727204525.48188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204525.48216: stderr chunk (state=3): >>><<< 46400 1727204525.48219: stdout chunk (state=3): >>><<< 46400 1727204525.48384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204525.4412944-47467-111992264805875=/root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204525.48388: variable 'ansible_module_compression' from source: unknown 46400 1727204525.48390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204525.48397: variable 'ansible_facts' from source: unknown 46400 1727204525.48474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/AnsiballZ_stat.py 46400 1727204525.48588: Sending initial data 46400 1727204525.48591: Sent initial data (153 bytes) 46400 1727204525.49249: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.49255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.49294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.49301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 46400 1727204525.49318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.49321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.49388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.49391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.49394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.49440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.51316: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204525.51706: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204525.51749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp490bq8cm /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/AnsiballZ_stat.py <<< 46400 1727204525.51792: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204525.53158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204525.53185: stderr chunk (state=3): >>><<< 46400 1727204525.53188: stdout chunk (state=3): >>><<< 46400 1727204525.53195: done transferring module to remote 46400 1727204525.53209: _low_level_execute_command(): starting 46400 1727204525.53215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/ /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/AnsiballZ_stat.py && sleep 0' 46400 1727204525.54384: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.54400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.54431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.54506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.54510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204525.54532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 46400 1727204525.54566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204525.54573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.54667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.54685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.54703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.54770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.56536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204525.56643: stderr chunk (state=3): >>><<< 46400 1727204525.56648: stdout chunk (state=3): >>><<< 46400 1727204525.56680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204525.56683: _low_level_execute_command(): starting 46400 1727204525.56688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/AnsiballZ_stat.py && sleep 0' 46400 1727204525.57851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.58019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.58022: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204525.58026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.58037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204525.58041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204525.58060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204525.58063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.58084: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.58107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.58110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.58139: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204525.58142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.58288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.58291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.58293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.58409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.71733: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32550, "dev": 21, "nlink": 1, "atime": 1727204521.8157551, "mtime": 1727204521.8157551, "ctime": 1727204521.8157551, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204525.72824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204525.72830: stdout chunk (state=3): >>><<< 46400 1727204525.72833: stderr chunk (state=3): >>><<< 46400 1727204525.72866: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32550, "dev": 21, "nlink": 1, "atime": 1727204521.8157551, "mtime": 1727204521.8157551, "ctime": 1727204521.8157551, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
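The module result above shows the stat module run against /sys/class/net/statebr with get_attributes, get_checksum, and get_mime all disabled, which is how the included get_interface_stat.yml checks whether the interface exists in sysfs. A minimal sketch of such a task, mirroring the module_args printed in the result; the register name interface_stat is an assumption used only within these sketches.

    # Hypothetical sketch of get_interface_stat.yml:3; the stat arguments mirror the
    # module_args printed in the log, the register name is assumed.
    - name: "Get stat for interface {{ interface }}"
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat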
46400 1727204525.72919: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204525.72939: _low_level_execute_command(): starting 46400 1727204525.72943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204525.4412944-47467-111992264805875/ > /dev/null 2>&1 && sleep 0' 46400 1727204525.73628: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204525.73636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.73646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.73674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.73707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.73714: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204525.73723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.73738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204525.73745: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204525.73748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204525.73779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204525.73783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204525.73799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204525.73802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204525.73804: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204525.73815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204525.73887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204525.73902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204525.73906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204525.74026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204525.75832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204525.75927: stderr chunk (state=3): >>><<< 46400 1727204525.75930: stdout chunk (state=3): >>><<< 46400 1727204525.75946: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204525.75954: handler run complete 46400 1727204525.76002: attempt loop complete, returning result 46400 1727204525.76005: _execute() done 46400 1727204525.76007: dumping result to json 46400 1727204525.76012: done dumping result, returning 46400 1727204525.76020: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-0000000004e8] 46400 1727204525.76026: sending task result for task 0affcd87-79f5-1303-fda8-0000000004e8 ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204521.8157551, "block_size": 4096, "blocks": 0, "ctime": 1727204521.8157551, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204521.8157551, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 46400 1727204525.76258: no more pending results, returning what we have 46400 1727204525.76268: results queue empty 46400 1727204525.76269: checking for any_errors_fatal 46400 1727204525.76271: done checking for any_errors_fatal 46400 1727204525.76272: checking for max_fail_percentage 46400 1727204525.76273: done checking for max_fail_percentage 46400 1727204525.76274: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.76275: done checking to see if all hosts have failed 46400 1727204525.76276: getting the remaining hosts for this loop 46400 1727204525.76277: done getting the remaining hosts for this loop 46400 1727204525.76282: getting the next task for host managed-node2 46400 1727204525.76306: done getting next task for host managed-node2 46400 1727204525.76315: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 46400 1727204525.76318: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204525.76323: getting variables 46400 1727204525.76324: in VariableManager get_vars() 46400 1727204525.76370: Calling all_inventory to load vars for managed-node2 46400 1727204525.76373: Calling groups_inventory to load vars for managed-node2 46400 1727204525.76376: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.76386: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.76389: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.76392: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.76947: done sending task result for task 0affcd87-79f5-1303-fda8-0000000004e8 46400 1727204525.76957: WORKER PROCESS EXITING 46400 1727204525.79236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.81485: done with get_vars() 46400 1727204525.81521: done getting variables 46400 1727204525.81604: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204525.81733: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.445) 0:00:16.102 ***** 46400 1727204525.81774: entering _queue_task() for managed-node2/assert 46400 1727204525.82116: worker is 1 (out of 1 available) 46400 1727204525.82128: exiting _queue_task() for managed-node2/assert 46400 1727204525.82144: done queuing things up, now waiting for results queue to drain 46400 1727204525.82145: waiting for pending results... 
46400 1727204525.82457: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 46400 1727204525.82563: in run() - task 0affcd87-79f5-1303-fda8-000000000453 46400 1727204525.82579: variable 'ansible_search_path' from source: unknown 46400 1727204525.82582: variable 'ansible_search_path' from source: unknown 46400 1727204525.82624: calling self._execute() 46400 1727204525.82719: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.82724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.82736: variable 'omit' from source: magic vars 46400 1727204525.83157: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.83175: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.83179: variable 'omit' from source: magic vars 46400 1727204525.83232: variable 'omit' from source: magic vars 46400 1727204525.83333: variable 'interface' from source: play vars 46400 1727204525.83352: variable 'omit' from source: magic vars 46400 1727204525.83398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204525.83445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204525.83456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204525.83484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.83496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.83524: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204525.83527: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.83531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.83658: Set connection var ansible_shell_type to sh 46400 1727204525.83661: Set connection var ansible_shell_executable to /bin/sh 46400 1727204525.83673: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204525.83683: Set connection var ansible_connection to ssh 46400 1727204525.83689: Set connection var ansible_pipelining to False 46400 1727204525.83694: Set connection var ansible_timeout to 10 46400 1727204525.83719: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.83723: variable 'ansible_connection' from source: unknown 46400 1727204525.83725: variable 'ansible_module_compression' from source: unknown 46400 1727204525.83728: variable 'ansible_shell_type' from source: unknown 46400 1727204525.83730: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.83732: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.83735: variable 'ansible_pipelining' from source: unknown 46400 1727204525.83737: variable 'ansible_timeout' from source: unknown 46400 1727204525.83768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.83894: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204525.83920: variable 'omit' from source: magic vars 46400 1727204525.83924: starting attempt loop 46400 1727204525.83927: running the handler 46400 1727204525.84082: variable 'interface_stat' from source: set_fact 46400 1727204525.84118: Evaluated conditional (interface_stat.stat.exists): True 46400 1727204525.84121: handler run complete 46400 1727204525.84124: attempt loop complete, returning result 46400 1727204525.84126: _execute() done 46400 1727204525.84129: dumping result to json 46400 1727204525.84131: done dumping result, returning 46400 1727204525.84148: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [0affcd87-79f5-1303-fda8-000000000453] 46400 1727204525.84152: sending task result for task 0affcd87-79f5-1303-fda8-000000000453 46400 1727204525.84236: done sending task result for task 0affcd87-79f5-1303-fda8-000000000453 46400 1727204525.84239: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204525.84315: no more pending results, returning what we have 46400 1727204525.84319: results queue empty 46400 1727204525.84320: checking for any_errors_fatal 46400 1727204525.84332: done checking for any_errors_fatal 46400 1727204525.84333: checking for max_fail_percentage 46400 1727204525.84335: done checking for max_fail_percentage 46400 1727204525.84336: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.84337: done checking to see if all hosts have failed 46400 1727204525.84338: getting the remaining hosts for this loop 46400 1727204525.84340: done getting the remaining hosts for this loop 46400 1727204525.84344: getting the next task for host managed-node2 46400 1727204525.84353: done getting next task for host managed-node2 46400 1727204525.84357: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204525.84362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204525.84368: getting variables 46400 1727204525.84370: in VariableManager get_vars() 46400 1727204525.84402: Calling all_inventory to load vars for managed-node2 46400 1727204525.84405: Calling groups_inventory to load vars for managed-node2 46400 1727204525.84409: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.84421: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.84423: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.84426: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.86911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.89155: done with get_vars() 46400 1727204525.89191: done getting variables 46400 1727204525.89267: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204525.89421: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.076) 0:00:16.179 ***** 46400 1727204525.89473: entering _queue_task() for managed-node2/debug 46400 1727204525.89984: worker is 1 (out of 1 available) 46400 1727204525.89998: exiting _queue_task() for managed-node2/debug 46400 1727204525.90011: done queuing things up, now waiting for results queue to drain 46400 1727204525.90013: waiting for pending results... 
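The two tasks traced above come from tasks/assert_device_present.yml (see the task path logged before them). Reconstructed from the stat module_args and the evaluated conditional interface_stat.stat.exists in this log, they amount to roughly the following sketch; it is inferred from the trace, not quoted from the playbook source, and the task names and the exact way interface_stat gets populated are assumptions:

- name: Get stat for interface {{ interface }}   # name assumed, mirroring the TASK banner in the log
  stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # assumed wiring; the trace only shows interface_stat coming from set_fact

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists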
46400 1727204525.90348: running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile' 46400 1727204525.91205: in run() - task 0affcd87-79f5-1303-fda8-000000000098 46400 1727204525.91212: variable 'ansible_search_path' from source: unknown 46400 1727204525.91330: variable 'ansible_search_path' from source: unknown 46400 1727204525.91370: calling self._execute() 46400 1727204525.91466: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.91475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.91487: variable 'omit' from source: magic vars 46400 1727204525.91892: variable 'ansible_distribution_major_version' from source: facts 46400 1727204525.91905: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204525.91911: variable 'omit' from source: magic vars 46400 1727204525.91954: variable 'omit' from source: magic vars 46400 1727204525.92058: variable 'lsr_description' from source: include params 46400 1727204525.92081: variable 'omit' from source: magic vars 46400 1727204525.92126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204525.92156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204525.92183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204525.92315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.92325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204525.92353: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204525.92357: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.92360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.92599: Set connection var ansible_shell_type to sh 46400 1727204525.92602: Set connection var ansible_shell_executable to /bin/sh 46400 1727204525.92605: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204525.92607: Set connection var ansible_connection to ssh 46400 1727204525.92617: Set connection var ansible_pipelining to False 46400 1727204525.92619: Set connection var ansible_timeout to 10 46400 1727204525.92760: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.92764: variable 'ansible_connection' from source: unknown 46400 1727204525.92766: variable 'ansible_module_compression' from source: unknown 46400 1727204525.92771: variable 'ansible_shell_type' from source: unknown 46400 1727204525.92773: variable 'ansible_shell_executable' from source: unknown 46400 1727204525.92776: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204525.92778: variable 'ansible_pipelining' from source: unknown 46400 1727204525.92781: variable 'ansible_timeout' from source: unknown 46400 1727204525.92786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204525.93032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204525.93142: variable 'omit' from source: magic vars 46400 1727204525.93341: starting attempt loop 46400 1727204525.93506: running the handler 46400 1727204525.93545: handler run complete 46400 1727204525.93549: attempt loop complete, returning result 46400 1727204525.93552: _execute() done 46400 1727204525.93557: dumping result to json 46400 1727204525.93560: done dumping result, returning 46400 1727204525.93563: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile' [0affcd87-79f5-1303-fda8-000000000098] 46400 1727204525.93566: sending task result for task 0affcd87-79f5-1303-fda8-000000000098 46400 1727204525.93642: done sending task result for task 0affcd87-79f5-1303-fda8-000000000098 46400 1727204525.93647: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 46400 1727204525.93703: no more pending results, returning what we have 46400 1727204525.93707: results queue empty 46400 1727204525.93708: checking for any_errors_fatal 46400 1727204525.93715: done checking for any_errors_fatal 46400 1727204525.93716: checking for max_fail_percentage 46400 1727204525.93718: done checking for max_fail_percentage 46400 1727204525.93719: checking to see if all hosts have failed and the running result is not ok 46400 1727204525.93720: done checking to see if all hosts have failed 46400 1727204525.93720: getting the remaining hosts for this loop 46400 1727204525.93723: done getting the remaining hosts for this loop 46400 1727204525.93727: getting the next task for host managed-node2 46400 1727204525.93737: done getting next task for host managed-node2 46400 1727204525.93740: ^ task is: TASK: Cleanup 46400 1727204525.93744: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204525.93749: getting variables 46400 1727204525.93751: in VariableManager get_vars() 46400 1727204525.93789: Calling all_inventory to load vars for managed-node2 46400 1727204525.93792: Calling groups_inventory to load vars for managed-node2 46400 1727204525.93796: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204525.93813: Calling all_plugins_play to load vars for managed-node2 46400 1727204525.93822: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204525.93828: Calling groups_plugins_play to load vars for managed-node2 46400 1727204525.96243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204525.99070: done with get_vars() 46400 1727204525.99093: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.097) 0:00:16.276 ***** 46400 1727204525.99194: entering _queue_task() for managed-node2/include_tasks 46400 1727204525.99533: worker is 1 (out of 1 available) 46400 1727204525.99548: exiting _queue_task() for managed-node2/include_tasks 46400 1727204525.99568: done queuing things up, now waiting for results queue to drain 46400 1727204525.99570: waiting for pending results... 46400 1727204525.99895: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204526.00019: in run() - task 0affcd87-79f5-1303-fda8-00000000009c 46400 1727204526.00032: variable 'ansible_search_path' from source: unknown 46400 1727204526.00035: variable 'ansible_search_path' from source: unknown 46400 1727204526.00085: variable 'lsr_cleanup' from source: include params 46400 1727204526.00353: variable 'lsr_cleanup' from source: include params 46400 1727204526.00423: variable 'omit' from source: magic vars 46400 1727204526.00676: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204526.00680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204526.00684: variable 'omit' from source: magic vars 46400 1727204526.02061: variable 'ansible_distribution_major_version' from source: facts 46400 1727204526.02076: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204526.02088: variable 'item' from source: unknown 46400 1727204526.02153: variable 'item' from source: unknown 46400 1727204526.02193: variable 'item' from source: unknown 46400 1727204526.02253: variable 'item' from source: unknown 46400 1727204526.02382: dumping result to json 46400 1727204526.02385: done dumping result, returning 46400 1727204526.02387: done running TaskExecutor() for managed-node2/TASK: Cleanup [0affcd87-79f5-1303-fda8-00000000009c] 46400 1727204526.02389: sending task result for task 0affcd87-79f5-1303-fda8-00000000009c 46400 1727204526.02426: done sending task result for task 0affcd87-79f5-1303-fda8-00000000009c 46400 1727204526.02429: WORKER PROCESS EXITING 46400 1727204526.02510: no more pending results, returning what we have 46400 1727204526.02516: in VariableManager get_vars() 46400 1727204526.02556: Calling all_inventory to load vars for managed-node2 46400 1727204526.02561: Calling groups_inventory to load vars for managed-node2 46400 1727204526.02567: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204526.02580: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204526.02584: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204526.02587: Calling groups_plugins_play to load vars for managed-node2 46400 1727204526.12923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204526.16438: done with get_vars() 46400 1727204526.17175: variable 'ansible_search_path' from source: unknown 46400 1727204526.17177: variable 'ansible_search_path' from source: unknown 46400 1727204526.17217: we have included files to process 46400 1727204526.17218: generating all_blocks data 46400 1727204526.17220: done generating all_blocks data 46400 1727204526.17223: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204526.17224: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204526.17226: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204526.17456: done processing included file 46400 1727204526.17458: iterating over new_blocks loaded from include file 46400 1727204526.17462: in VariableManager get_vars() 46400 1727204526.17479: done with get_vars() 46400 1727204526.17481: filtering new block on tags 46400 1727204526.17505: done filtering new block on tags 46400 1727204526.17507: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204526.17511: extending task lists for all hosts with included blocks 46400 1727204526.20041: done extending task lists 46400 1727204526.20043: done processing included files 46400 1727204526.20044: results queue empty 46400 1727204526.20045: checking for any_errors_fatal 46400 1727204526.20048: done checking for any_errors_fatal 46400 1727204526.20049: checking for max_fail_percentage 46400 1727204526.20050: done checking for max_fail_percentage 46400 1727204526.20051: checking to see if all hosts have failed and the running result is not ok 46400 1727204526.20052: done checking to see if all hosts have failed 46400 1727204526.20053: getting the remaining hosts for this loop 46400 1727204526.20054: done getting the remaining hosts for this loop 46400 1727204526.20056: getting the next task for host managed-node2 46400 1727204526.20065: done getting next task for host managed-node2 46400 1727204526.20067: ^ task is: TASK: Cleanup profile and device 46400 1727204526.20070: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204526.20072: getting variables 46400 1727204526.20074: in VariableManager get_vars() 46400 1727204526.20086: Calling all_inventory to load vars for managed-node2 46400 1727204526.20088: Calling groups_inventory to load vars for managed-node2 46400 1727204526.20091: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204526.20097: Calling all_plugins_play to load vars for managed-node2 46400 1727204526.20100: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204526.20103: Calling groups_plugins_play to load vars for managed-node2 46400 1727204526.23112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204526.26399: done with get_vars() 46400 1727204526.26428: done getting variables 46400 1727204526.26689: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.275) 0:00:16.551 ***** 46400 1727204526.26719: entering _queue_task() for managed-node2/shell 46400 1727204526.27274: worker is 1 (out of 1 available) 46400 1727204526.27286: exiting _queue_task() for managed-node2/shell 46400 1727204526.27298: done queuing things up, now waiting for results queue to drain 46400 1727204526.27300: waiting for pending results... 
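The Cleanup step above is an include driven by the lsr_cleanup parameter: the trace shows item resolving to tasks/cleanup_profile+device.yml and that file then being loaded and its blocks appended to the task list. A minimal sketch of that shape, assuming an include_tasks loop (the actual wording in run_test.yml may differ):

- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"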
46400 1727204526.28295: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204526.28528: in run() - task 0affcd87-79f5-1303-fda8-00000000050b 46400 1727204526.28539: variable 'ansible_search_path' from source: unknown 46400 1727204526.28543: variable 'ansible_search_path' from source: unknown 46400 1727204526.28781: calling self._execute() 46400 1727204526.28873: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204526.28957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204526.28994: variable 'omit' from source: magic vars 46400 1727204526.29900: variable 'ansible_distribution_major_version' from source: facts 46400 1727204526.29913: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204526.29919: variable 'omit' from source: magic vars 46400 1727204526.30076: variable 'omit' from source: magic vars 46400 1727204526.30463: variable 'interface' from source: play vars 46400 1727204526.30486: variable 'omit' from source: magic vars 46400 1727204526.30641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204526.30683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204526.30705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204526.30722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204526.30848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204526.30881: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204526.30885: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204526.30888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204526.30984: Set connection var ansible_shell_type to sh 46400 1727204526.30992: Set connection var ansible_shell_executable to /bin/sh 46400 1727204526.30998: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204526.31003: Set connection var ansible_connection to ssh 46400 1727204526.31009: Set connection var ansible_pipelining to False 46400 1727204526.31014: Set connection var ansible_timeout to 10 46400 1727204526.31044: variable 'ansible_shell_executable' from source: unknown 46400 1727204526.31048: variable 'ansible_connection' from source: unknown 46400 1727204526.31051: variable 'ansible_module_compression' from source: unknown 46400 1727204526.31054: variable 'ansible_shell_type' from source: unknown 46400 1727204526.31056: variable 'ansible_shell_executable' from source: unknown 46400 1727204526.31059: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204526.31290: variable 'ansible_pipelining' from source: unknown 46400 1727204526.31294: variable 'ansible_timeout' from source: unknown 46400 1727204526.31296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204526.31554: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204526.31570: variable 'omit' from source: magic vars 46400 1727204526.31575: starting attempt loop 46400 1727204526.31578: running the handler 46400 1727204526.31588: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204526.31677: _low_level_execute_command(): starting 46400 1727204526.31685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204526.34203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.34216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.34235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.34248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.34294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.34342: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204526.34352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.34457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.34470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.34478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.34486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.34496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.34507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.34516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.34522: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.34532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.34604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.34618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.34683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.34881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.36497: stdout chunk (state=3): >>>/root <<< 46400 1727204526.36679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204526.36685: stdout chunk (state=3): >>><<< 46400 1727204526.36694: stderr chunk (state=3): >>><<< 46400 1727204526.36719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204526.36732: _low_level_execute_command(): starting 46400 1727204526.36740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901 `" && echo ansible-tmp-1727204526.3671694-47505-84382813385901="` echo /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901 `" ) && sleep 0' 46400 1727204526.38207: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.38216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.38228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.38247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.38291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.38299: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204526.38309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.38323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.38330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.38337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.38347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.38359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.38379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.38386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.38393: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.38404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.38491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.38498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.38508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.38606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.40486: stdout chunk (state=3): 
>>>ansible-tmp-1727204526.3671694-47505-84382813385901=/root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901 <<< 46400 1727204526.40674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204526.40677: stdout chunk (state=3): >>><<< 46400 1727204526.40687: stderr chunk (state=3): >>><<< 46400 1727204526.40717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204526.3671694-47505-84382813385901=/root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204526.40750: variable 'ansible_module_compression' from source: unknown 46400 1727204526.40812: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204526.40848: variable 'ansible_facts' from source: unknown 46400 1727204526.40931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/AnsiballZ_command.py 46400 1727204526.41604: Sending initial data 46400 1727204526.41607: Sent initial data (155 bytes) 46400 1727204526.45051: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.45068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.45083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.45098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.45143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.45150: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204526.45165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.45181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.45188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.45195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.45202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.45211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204526.45227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.45235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.45242: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.45251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.45452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.45473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.45483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.45670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.47395: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204526.47484: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204526.47488: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp61r4w1xo /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/AnsiballZ_command.py <<< 46400 1727204526.47519: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204526.48803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204526.49104: stderr chunk (state=3): >>><<< 46400 1727204526.49108: stdout chunk (state=3): >>><<< 46400 1727204526.49111: done transferring module to remote 46400 1727204526.49113: _low_level_execute_command(): starting 46400 1727204526.49115: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/ /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/AnsiballZ_command.py && sleep 0' 46400 1727204526.49731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.49748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.49767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.49797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.49838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.49851: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204526.49874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.49899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.49917: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.49930: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.49942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.49956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.49978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.49991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.50007: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.50028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.50105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.50137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.50155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.50234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.52070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204526.52167: stderr chunk (state=3): >>><<< 46400 1727204526.52180: stdout chunk (state=3): >>><<< 46400 1727204526.52280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204526.52285: _low_level_execute_command(): starting 46400 1727204526.52288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/AnsiballZ_command.py && sleep 0' 46400 1727204526.52943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.52956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.52975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.52992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.53041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.53053: stderr chunk (state=3): >>>debug2: 
match not found <<< 46400 1727204526.53072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.53091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.53103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.53115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.53127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.53142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.53172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.53187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.53199: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.53214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.53305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.53327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.53343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.53422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.72708: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (994c2922-44e9-4ac5-9912-c2f948bcac87) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:06.667560", "end": "2024-09-24 15:02:06.725962", "delta": "0:00:00.058402", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204526.74001: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204526.74005: stdout chunk (state=3): >>><<< 46400 1727204526.74007: stderr chunk (state=3): >>><<< 46400 1727204526.74071: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (994c2922-44e9-4ac5-9912-c2f948bcac87) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:06.667560", "end": "2024-09-24 15:02:06.725962", "delta": "0:00:00.058402", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
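From the _raw_params echoed in that result, the Cleanup profile and device step is a shell task along the following lines, with {{ interface }} rendered to statebr. This is a reconstruction from the log rather than the verbatim cleanup_profile+device.yml, and the ignore_errors setting is an assumption based on the rc=1 failure not aborting the play:

- name: Cleanup profile and device
  shell: |
    nmcli con delete {{ interface }}
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    ip link del {{ interface }}
  ignore_errors: true   # assumed: the non-zero return code is reported but the run continues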
46400 1727204526.74125: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204526.74130: _low_level_execute_command(): starting 46400 1727204526.74133: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204526.3671694-47505-84382813385901/ > /dev/null 2>&1 && sleep 0' 46400 1727204526.75329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204526.75365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.75414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.75433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.75478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.75506: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204526.75521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.75538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204526.75549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204526.75558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204526.75571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204526.75583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204526.75597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204526.75606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204526.75615: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204526.75624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204526.75695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204526.75710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204526.75723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204526.75804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204526.77713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204526.77716: stdout chunk (state=3): >>><<< 
46400 1727204526.77727: stderr chunk (state=3): >>><<< 46400 1727204526.77742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204526.77748: handler run complete 46400 1727204526.77774: Evaluated conditional (False): False 46400 1727204526.77784: attempt loop complete, returning result 46400 1727204526.77787: _execute() done 46400 1727204526.77789: dumping result to json 46400 1727204526.77795: done dumping result, returning 46400 1727204526.77804: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-00000000050b] 46400 1727204526.77812: sending task result for task 0affcd87-79f5-1303-fda8-00000000050b 46400 1727204526.77915: done sending task result for task 0affcd87-79f5-1303-fda8-00000000050b 46400 1727204526.77918: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.058402", "end": "2024-09-24 15:02:06.725962", "rc": 1, "start": "2024-09-24 15:02:06.667560" } STDOUT: Connection 'statebr' (994c2922-44e9-4ac5-9912-c2f948bcac87) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204526.77984: no more pending results, returning what we have 46400 1727204526.77989: results queue empty 46400 1727204526.77990: checking for any_errors_fatal 46400 1727204526.77992: done checking for any_errors_fatal 46400 1727204526.77992: checking for max_fail_percentage 46400 1727204526.77994: done checking for max_fail_percentage 46400 1727204526.77995: checking to see if all hosts have failed and the running result is not ok 46400 1727204526.77996: done checking to see if all hosts have failed 46400 1727204526.77996: getting the remaining hosts for this loop 46400 1727204526.77998: done getting the remaining hosts for this loop 46400 1727204526.78002: getting the next task for host managed-node2 46400 1727204526.78014: done getting next task for host managed-node2 46400 1727204526.78017: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204526.78019: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204526.78022: getting variables 46400 1727204526.78024: in VariableManager get_vars() 46400 1727204526.78062: Calling all_inventory to load vars for managed-node2 46400 1727204526.78067: Calling groups_inventory to load vars for managed-node2 46400 1727204526.78070: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204526.78080: Calling all_plugins_play to load vars for managed-node2 46400 1727204526.78083: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204526.78085: Calling groups_plugins_play to load vars for managed-node2 46400 1727204526.80278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204526.82658: done with get_vars() 46400 1727204526.82690: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.560) 0:00:17.112 ***** 46400 1727204526.82802: entering _queue_task() for managed-node2/include_tasks 46400 1727204526.83629: worker is 1 (out of 1 available) 46400 1727204526.83648: exiting _queue_task() for managed-node2/include_tasks 46400 1727204526.83687: done queuing things up, now waiting for results queue to drain 46400 1727204526.83689: waiting for pending results... 46400 1727204526.84213: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204526.84365: in run() - task 0affcd87-79f5-1303-fda8-00000000000f 46400 1727204526.84437: variable 'ansible_search_path' from source: unknown 46400 1727204526.84554: calling self._execute() 46400 1727204526.84739: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204526.84750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204526.84765: variable 'omit' from source: magic vars 46400 1727204526.85522: variable 'ansible_distribution_major_version' from source: facts 46400 1727204526.85540: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204526.85553: _execute() done 46400 1727204526.85560: dumping result to json 46400 1727204526.85569: done dumping result, returning 46400 1727204526.85579: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-00000000000f] 46400 1727204526.85588: sending task result for task 0affcd87-79f5-1303-fda8-00000000000f 46400 1727204526.85766: no more pending results, returning what we have 46400 1727204526.85772: in VariableManager get_vars() 46400 1727204526.85811: Calling all_inventory to load vars for managed-node2 46400 1727204526.85815: Calling groups_inventory to load vars for managed-node2 46400 1727204526.85818: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204526.85835: Calling all_plugins_play to load vars for managed-node2 46400 1727204526.85838: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204526.85842: Calling groups_plugins_play to load vars for managed-node2 46400 1727204526.88597: done sending task result for task 0affcd87-79f5-1303-fda8-00000000000f 46400 1727204526.88601: WORKER PROCESS EXITING 46400 1727204526.91362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 46400 1727204526.93298: done with get_vars() 46400 1727204526.93321: variable 'ansible_search_path' from source: unknown 46400 1727204526.93338: we have included files to process 46400 1727204526.93339: generating all_blocks data 46400 1727204526.93341: done generating all_blocks data 46400 1727204526.93347: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204526.93348: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204526.93350: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204526.93787: in VariableManager get_vars() 46400 1727204526.93806: done with get_vars() 46400 1727204526.93855: in VariableManager get_vars() 46400 1727204526.93872: done with get_vars() 46400 1727204526.93911: in VariableManager get_vars() 46400 1727204526.93936: done with get_vars() 46400 1727204526.93979: in VariableManager get_vars() 46400 1727204526.93995: done with get_vars() 46400 1727204526.94043: in VariableManager get_vars() 46400 1727204526.94059: done with get_vars() 46400 1727204526.94468: in VariableManager get_vars() 46400 1727204526.94483: done with get_vars() 46400 1727204526.94494: done processing included file 46400 1727204526.94495: iterating over new_blocks loaded from include file 46400 1727204526.94496: in VariableManager get_vars() 46400 1727204526.94505: done with get_vars() 46400 1727204526.94506: filtering new block on tags 46400 1727204526.94592: done filtering new block on tags 46400 1727204526.94595: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204526.94600: extending task lists for all hosts with included blocks 46400 1727204526.94630: done extending task lists 46400 1727204526.94631: done processing included files 46400 1727204526.94631: results queue empty 46400 1727204526.94632: checking for any_errors_fatal 46400 1727204526.94636: done checking for any_errors_fatal 46400 1727204526.94637: checking for max_fail_percentage 46400 1727204526.94638: done checking for max_fail_percentage 46400 1727204526.94638: checking to see if all hosts have failed and the running result is not ok 46400 1727204526.94639: done checking to see if all hosts have failed 46400 1727204526.94640: getting the remaining hosts for this loop 46400 1727204526.94641: done getting the remaining hosts for this loop 46400 1727204526.94643: getting the next task for host managed-node2 46400 1727204526.94646: done getting next task for host managed-node2 46400 1727204526.94648: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204526.94650: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204526.94652: getting variables 46400 1727204526.94653: in VariableManager get_vars() 46400 1727204526.94660: Calling all_inventory to load vars for managed-node2 46400 1727204526.94662: Calling groups_inventory to load vars for managed-node2 46400 1727204526.94667: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204526.94675: Calling all_plugins_play to load vars for managed-node2 46400 1727204526.94678: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204526.94681: Calling groups_plugins_play to load vars for managed-node2 46400 1727204526.97398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204526.99227: done with get_vars() 46400 1727204526.99254: done getting variables 46400 1727204526.99305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204526.99440: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.166) 0:00:17.279 ***** 46400 1727204526.99471: entering _queue_task() for managed-node2/debug 46400 1727204526.99833: worker is 1 (out of 1 available) 46400 1727204526.99845: exiting _queue_task() for managed-node2/debug 46400 1727204526.99861: done queuing things up, now waiting for results queue to drain 46400 1727204526.99863: waiting for pending results... 
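At this point tests_states.yml:45 has handed control to tasks/run_test.yml, which is parameterised through the lsr_* variables echoed by the "Show item" task below. The playbook text is not quoted in this log, so the include below is only a sketch of how such a call is typically parameterised; every value is copied from the loop output later in this run, and note that the log attributes lsr_fail_debug to play vars rather than include params.

- name: Include the task 'run_test.yml'
  include_tasks: tasks/run_test.yml
  vars:
    lsr_description: I can create a profile without autoconnect
    lsr_setup:
      - tasks/delete_interface.yml
      - tasks/assert_device_absent.yml
    lsr_test:
      - tasks/create_bridge_profile_no_autoconnect.yml
    lsr_assert:
      - tasks/assert_device_absent.yml
      - tasks/assert_profile_present.yml
    lsr_fail_debug:  # reported as a play var in this log, shown here only for completeness
      - __network_connections_result
    lsr_cleanup:
      - tasks/cleanup_profile+device.yml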
46400 1727204527.00187: running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile without autoconnect 46400 1727204527.00307: in run() - task 0affcd87-79f5-1303-fda8-0000000005b4 46400 1727204527.00328: variable 'ansible_search_path' from source: unknown 46400 1727204527.00335: variable 'ansible_search_path' from source: unknown 46400 1727204527.00375: calling self._execute() 46400 1727204527.00478: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.00490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.00505: variable 'omit' from source: magic vars 46400 1727204527.00931: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.00947: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.00971: variable 'omit' from source: magic vars 46400 1727204527.01013: variable 'omit' from source: magic vars 46400 1727204527.01135: variable 'lsr_description' from source: include params 46400 1727204527.01172: variable 'omit' from source: magic vars 46400 1727204527.01235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204527.01286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.01319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204527.01349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.01376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.01419: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.01428: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.01436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.01567: Set connection var ansible_shell_type to sh 46400 1727204527.01589: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.01618: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.01646: Set connection var ansible_connection to ssh 46400 1727204527.01657: Set connection var ansible_pipelining to False 46400 1727204527.01686: Set connection var ansible_timeout to 10 46400 1727204527.01719: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.01729: variable 'ansible_connection' from source: unknown 46400 1727204527.01741: variable 'ansible_module_compression' from source: unknown 46400 1727204527.01748: variable 'ansible_shell_type' from source: unknown 46400 1727204527.01755: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.01767: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.01796: variable 'ansible_pipelining' from source: unknown 46400 1727204527.01810: variable 'ansible_timeout' from source: unknown 46400 1727204527.01852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.02005: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204527.02021: variable 'omit' from source: magic vars 46400 1727204527.02029: starting attempt loop 46400 1727204527.02037: running the handler 46400 1727204527.02095: handler run complete 46400 1727204527.02114: attempt loop complete, returning result 46400 1727204527.02121: _execute() done 46400 1727204527.02129: dumping result to json 46400 1727204527.02137: done dumping result, returning 46400 1727204527.02149: done running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile without autoconnect [0affcd87-79f5-1303-fda8-0000000005b4] 46400 1727204527.02211: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b4 46400 1727204527.03092: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b4 46400 1727204527.03098: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can create a profile without autoconnect ########## 46400 1727204527.03144: no more pending results, returning what we have 46400 1727204527.03148: results queue empty 46400 1727204527.03149: checking for any_errors_fatal 46400 1727204527.03151: done checking for any_errors_fatal 46400 1727204527.03152: checking for max_fail_percentage 46400 1727204527.03153: done checking for max_fail_percentage 46400 1727204527.03154: checking to see if all hosts have failed and the running result is not ok 46400 1727204527.03155: done checking to see if all hosts have failed 46400 1727204527.03156: getting the remaining hosts for this loop 46400 1727204527.03157: done getting the remaining hosts for this loop 46400 1727204527.03166: getting the next task for host managed-node2 46400 1727204527.03174: done getting next task for host managed-node2 46400 1727204527.03176: ^ task is: TASK: Show item 46400 1727204527.03179: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204527.03183: getting variables 46400 1727204527.03184: in VariableManager get_vars() 46400 1727204527.03215: Calling all_inventory to load vars for managed-node2 46400 1727204527.03218: Calling groups_inventory to load vars for managed-node2 46400 1727204527.03221: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.03232: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.03235: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.03237: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.05528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.08193: done with get_vars() 46400 1727204527.08225: done getting variables 46400 1727204527.08409: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.089) 0:00:17.368 ***** 46400 1727204527.08455: entering _queue_task() for managed-node2/debug 46400 1727204527.09120: worker is 1 (out of 1 available) 46400 1727204527.09135: exiting _queue_task() for managed-node2/debug 46400 1727204527.09150: done queuing things up, now waiting for results queue to drain 46400 1727204527.09152: waiting for pending results... 46400 1727204527.09468: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204527.09640: in run() - task 0affcd87-79f5-1303-fda8-0000000005b5 46400 1727204527.09662: variable 'ansible_search_path' from source: unknown 46400 1727204527.09676: variable 'ansible_search_path' from source: unknown 46400 1727204527.09753: variable 'omit' from source: magic vars 46400 1727204527.09966: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.09983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.09999: variable 'omit' from source: magic vars 46400 1727204527.10434: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.10454: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.10471: variable 'omit' from source: magic vars 46400 1727204527.10524: variable 'omit' from source: magic vars 46400 1727204527.10585: variable 'item' from source: unknown 46400 1727204527.10684: variable 'item' from source: unknown 46400 1727204527.10714: variable 'omit' from source: magic vars 46400 1727204527.10775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204527.10839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.10872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204527.10895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.10912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 46400 1727204527.10958: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.10973: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.10983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.11101: Set connection var ansible_shell_type to sh 46400 1727204527.11126: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.11144: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.11162: Set connection var ansible_connection to ssh 46400 1727204527.11176: Set connection var ansible_pipelining to False 46400 1727204527.11186: Set connection var ansible_timeout to 10 46400 1727204527.11212: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.11220: variable 'ansible_connection' from source: unknown 46400 1727204527.11227: variable 'ansible_module_compression' from source: unknown 46400 1727204527.11234: variable 'ansible_shell_type' from source: unknown 46400 1727204527.11244: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.11258: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.11274: variable 'ansible_pipelining' from source: unknown 46400 1727204527.11282: variable 'ansible_timeout' from source: unknown 46400 1727204527.11291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.11444: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.11465: variable 'omit' from source: magic vars 46400 1727204527.11486: starting attempt loop 46400 1727204527.11493: running the handler 46400 1727204527.11554: variable 'lsr_description' from source: include params 46400 1727204527.11650: variable 'lsr_description' from source: include params 46400 1727204527.11671: handler run complete 46400 1727204527.11703: attempt loop complete, returning result 46400 1727204527.11726: variable 'item' from source: unknown 46400 1727204527.11805: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 46400 1727204527.12162: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.12188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.12204: variable 'omit' from source: magic vars 46400 1727204527.12397: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.12408: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.12416: variable 'omit' from source: magic vars 46400 1727204527.12433: variable 'omit' from source: magic vars 46400 1727204527.12492: variable 'item' from source: unknown 46400 1727204527.12571: variable 'item' from source: unknown 46400 1727204527.12590: variable 'omit' from source: magic vars 46400 1727204527.12614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.12627: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.12639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.12669: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.12678: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.12685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.12758: Set connection var ansible_shell_type to sh 46400 1727204527.12785: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.12797: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.12807: Set connection var ansible_connection to ssh 46400 1727204527.12815: Set connection var ansible_pipelining to False 46400 1727204527.12824: Set connection var ansible_timeout to 10 46400 1727204527.12849: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.12858: variable 'ansible_connection' from source: unknown 46400 1727204527.12882: variable 'ansible_module_compression' from source: unknown 46400 1727204527.12895: variable 'ansible_shell_type' from source: unknown 46400 1727204527.12900: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.12905: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.12910: variable 'ansible_pipelining' from source: unknown 46400 1727204527.12915: variable 'ansible_timeout' from source: unknown 46400 1727204527.12920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.13013: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.13025: variable 'omit' from source: magic vars 46400 1727204527.13032: starting attempt loop 46400 1727204527.13037: running the handler 46400 1727204527.13059: variable 'lsr_setup' from source: include params 46400 1727204527.13145: variable 'lsr_setup' from source: include params 46400 1727204527.13197: handler run complete 46400 1727204527.13226: attempt loop complete, returning result 46400 1727204527.13245: variable 'item' from source: unknown 46400 1727204527.13315: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 46400 1727204527.13508: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.13523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.13539: variable 'omit' from source: magic vars 46400 1727204527.13716: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.13726: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.13734: variable 'omit' from source: magic vars 46400 1727204527.13751: variable 'omit' from source: magic vars 46400 1727204527.13803: variable 'item' from source: unknown 46400 1727204527.13871: variable 'item' from source: unknown 46400 1727204527.13898: variable 'omit' from source: magic vars 46400 1727204527.13922: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.13934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.13944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.13963: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.13974: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.13981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.14071: Set connection var ansible_shell_type to sh 46400 1727204527.14086: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.14103: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.14112: Set connection var ansible_connection to ssh 46400 1727204527.14120: Set connection var ansible_pipelining to False 46400 1727204527.14128: Set connection var ansible_timeout to 10 46400 1727204527.14154: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.14166: variable 'ansible_connection' from source: unknown 46400 1727204527.14174: variable 'ansible_module_compression' from source: unknown 46400 1727204527.14180: variable 'ansible_shell_type' from source: unknown 46400 1727204527.14186: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.14191: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.14198: variable 'ansible_pipelining' from source: unknown 46400 1727204527.14213: variable 'ansible_timeout' from source: unknown 46400 1727204527.14220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.14311: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.14328: variable 'omit' from source: magic vars 46400 1727204527.14335: starting attempt loop 46400 1727204527.14340: running the handler 46400 1727204527.14362: variable 'lsr_test' from source: include params 46400 1727204527.14432: variable 'lsr_test' from source: include params 46400 1727204527.14452: handler run complete 46400 1727204527.14474: attempt loop complete, returning result 46400 1727204527.14492: variable 'item' from source: unknown 46400 1727204527.14568: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 46400 1727204527.14729: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.14741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.14753: variable 'omit' from source: magic vars 46400 1727204527.14928: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.14937: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.14944: variable 'omit' from source: magic vars 46400 1727204527.14962: variable 'omit' from source: magic vars 46400 1727204527.15011: variable 'item' from 
source: unknown 46400 1727204527.15076: variable 'item' from source: unknown 46400 1727204527.15100: variable 'omit' from source: magic vars 46400 1727204527.15127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.15137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.15146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.15159: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.15171: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.15178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.15257: Set connection var ansible_shell_type to sh 46400 1727204527.15275: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.15284: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.15293: Set connection var ansible_connection to ssh 46400 1727204527.15301: Set connection var ansible_pipelining to False 46400 1727204527.15315: Set connection var ansible_timeout to 10 46400 1727204527.15345: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.15352: variable 'ansible_connection' from source: unknown 46400 1727204527.15358: variable 'ansible_module_compression' from source: unknown 46400 1727204527.15369: variable 'ansible_shell_type' from source: unknown 46400 1727204527.15375: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.15381: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.15387: variable 'ansible_pipelining' from source: unknown 46400 1727204527.15393: variable 'ansible_timeout' from source: unknown 46400 1727204527.15399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.15500: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.15511: variable 'omit' from source: magic vars 46400 1727204527.15519: starting attempt loop 46400 1727204527.15528: running the handler 46400 1727204527.15556: variable 'lsr_assert' from source: include params 46400 1727204527.15623: variable 'lsr_assert' from source: include params 46400 1727204527.15655: handler run complete 46400 1727204527.15678: attempt loop complete, returning result 46400 1727204527.15696: variable 'item' from source: unknown 46400 1727204527.15768: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 46400 1727204527.15925: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.15937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.15949: variable 'omit' from source: magic vars 46400 1727204527.16187: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.16205: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 46400 1727204527.16220: variable 'omit' from source: magic vars 46400 1727204527.16237: variable 'omit' from source: magic vars 46400 1727204527.16285: variable 'item' from source: unknown 46400 1727204527.16358: variable 'item' from source: unknown 46400 1727204527.16381: variable 'omit' from source: magic vars 46400 1727204527.16406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.16425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.16435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.16449: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.16456: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.16467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.16542: Set connection var ansible_shell_type to sh 46400 1727204527.16555: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.16571: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.16581: Set connection var ansible_connection to ssh 46400 1727204527.16590: Set connection var ansible_pipelining to False 46400 1727204527.16599: Set connection var ansible_timeout to 10 46400 1727204527.16624: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.16637: variable 'ansible_connection' from source: unknown 46400 1727204527.16649: variable 'ansible_module_compression' from source: unknown 46400 1727204527.16655: variable 'ansible_shell_type' from source: unknown 46400 1727204527.16666: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.16673: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.16681: variable 'ansible_pipelining' from source: unknown 46400 1727204527.16687: variable 'ansible_timeout' from source: unknown 46400 1727204527.16694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.16800: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.16813: variable 'omit' from source: magic vars 46400 1727204527.16821: starting attempt loop 46400 1727204527.16828: running the handler 46400 1727204527.16963: handler run complete 46400 1727204527.16995: attempt loop complete, returning result 46400 1727204527.17022: variable 'item' from source: unknown 46400 1727204527.17113: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 46400 1727204527.17299: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.17312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.17324: variable 'omit' from source: magic vars 46400 1727204527.17491: variable 'ansible_distribution_major_version' from source: 
facts 46400 1727204527.17501: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.17510: variable 'omit' from source: magic vars 46400 1727204527.17528: variable 'omit' from source: magic vars 46400 1727204527.17611: variable 'item' from source: unknown 46400 1727204527.18447: variable 'item' from source: unknown 46400 1727204527.18474: variable 'omit' from source: magic vars 46400 1727204527.18531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.18629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.18641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.18659: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.18673: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.18680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.18867: Set connection var ansible_shell_type to sh 46400 1727204527.18880: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.18889: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.18899: Set connection var ansible_connection to ssh 46400 1727204527.18908: Set connection var ansible_pipelining to False 46400 1727204527.18917: Set connection var ansible_timeout to 10 46400 1727204527.18953: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.18965: variable 'ansible_connection' from source: unknown 46400 1727204527.19062: variable 'ansible_module_compression' from source: unknown 46400 1727204527.19074: variable 'ansible_shell_type' from source: unknown 46400 1727204527.19081: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.19089: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.19096: variable 'ansible_pipelining' from source: unknown 46400 1727204527.19102: variable 'ansible_timeout' from source: unknown 46400 1727204527.19110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.19219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.19362: variable 'omit' from source: magic vars 46400 1727204527.19376: starting attempt loop 46400 1727204527.19396: running the handler 46400 1727204527.19424: variable 'lsr_fail_debug' from source: play vars 46400 1727204527.19590: variable 'lsr_fail_debug' from source: play vars 46400 1727204527.19734: handler run complete 46400 1727204527.19753: attempt loop complete, returning result 46400 1727204527.19778: variable 'item' from source: unknown 46400 1727204527.19935: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 1727204527.20208: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.20284: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 46400 1727204527.20299: variable 'omit' from source: magic vars 46400 1727204527.20727: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.20738: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.20746: variable 'omit' from source: magic vars 46400 1727204527.20770: variable 'omit' from source: magic vars 46400 1727204527.20928: variable 'item' from source: unknown 46400 1727204527.21006: variable 'item' from source: unknown 46400 1727204527.21052: variable 'omit' from source: magic vars 46400 1727204527.21171: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.21185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.21197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.21217: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.21225: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.21232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.21416: Set connection var ansible_shell_type to sh 46400 1727204527.21430: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.21440: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.21450: Set connection var ansible_connection to ssh 46400 1727204527.21459: Set connection var ansible_pipelining to False 46400 1727204527.21488: Set connection var ansible_timeout to 10 46400 1727204527.21607: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.21615: variable 'ansible_connection' from source: unknown 46400 1727204527.21621: variable 'ansible_module_compression' from source: unknown 46400 1727204527.21628: variable 'ansible_shell_type' from source: unknown 46400 1727204527.21634: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.21643: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.21651: variable 'ansible_pipelining' from source: unknown 46400 1727204527.21658: variable 'ansible_timeout' from source: unknown 46400 1727204527.21671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.21773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.21917: variable 'omit' from source: magic vars 46400 1727204527.21925: starting attempt loop 46400 1727204527.21930: running the handler 46400 1727204527.21950: variable 'lsr_cleanup' from source: include params 46400 1727204527.22135: variable 'lsr_cleanup' from source: include params 46400 1727204527.22157: handler run complete 46400 1727204527.22179: attempt loop complete, returning result 46400 1727204527.22197: variable 'item' from source: unknown 46400 1727204527.22297: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] 
} 46400 1727204527.22576: dumping result to json 46400 1727204527.22590: done dumping result, returning 46400 1727204527.22602: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-0000000005b5] 46400 1727204527.22613: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b5 46400 1727204527.22754: no more pending results, returning what we have 46400 1727204527.22762: results queue empty 46400 1727204527.22765: checking for any_errors_fatal 46400 1727204527.22773: done checking for any_errors_fatal 46400 1727204527.22774: checking for max_fail_percentage 46400 1727204527.22776: done checking for max_fail_percentage 46400 1727204527.22777: checking to see if all hosts have failed and the running result is not ok 46400 1727204527.22778: done checking to see if all hosts have failed 46400 1727204527.22779: getting the remaining hosts for this loop 46400 1727204527.22781: done getting the remaining hosts for this loop 46400 1727204527.22785: getting the next task for host managed-node2 46400 1727204527.22793: done getting next task for host managed-node2 46400 1727204527.22796: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204527.22799: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204527.22802: getting variables 46400 1727204527.22804: in VariableManager get_vars() 46400 1727204527.22841: Calling all_inventory to load vars for managed-node2 46400 1727204527.22844: Calling groups_inventory to load vars for managed-node2 46400 1727204527.22849: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.22866: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.22869: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.22872: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.23950: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b5 46400 1727204527.23953: WORKER PROCESS EXITING 46400 1727204527.25670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.28831: done with get_vars() 46400 1727204527.28868: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.205) 0:00:17.574 ***** 46400 1727204527.28999: entering _queue_task() for managed-node2/include_tasks 46400 1727204527.29441: worker is 1 (out of 1 available) 46400 1727204527.29454: exiting _queue_task() for managed-node2/include_tasks 46400 1727204527.29478: done queuing things up, now waiting for results queue to drain 46400 1727204527.29480: waiting for pending results... 
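The seven ok: [managed-node2] => (item=...) records above, including the "VARIABLE IS NOT DEFINED!" entry for lsr_assert_when, are the characteristic output of the debug action when var is given a templated variable name inside a loop. A minimal sketch of a "Show item" task (run_test.yml:9) consistent with that output, assuming this loop form rather than quoting the file:

- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"  # prints the value of the variable whose name is in item
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when  # undefined in this run, hence "VARIABLE IS NOT DEFINED!"
    - lsr_fail_debug
    - lsr_cleanup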
46400 1727204527.29806: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204527.29941: in run() - task 0affcd87-79f5-1303-fda8-0000000005b6 46400 1727204527.29965: variable 'ansible_search_path' from source: unknown 46400 1727204527.29976: variable 'ansible_search_path' from source: unknown 46400 1727204527.30019: calling self._execute() 46400 1727204527.30126: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.30138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.30168: variable 'omit' from source: magic vars 46400 1727204527.30597: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.30638: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.30656: _execute() done 46400 1727204527.30670: dumping result to json 46400 1727204527.31478: done dumping result, returning 46400 1727204527.31488: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-0000000005b6] 46400 1727204527.31527: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b6 46400 1727204527.31659: no more pending results, returning what we have 46400 1727204527.31669: in VariableManager get_vars() 46400 1727204527.31708: Calling all_inventory to load vars for managed-node2 46400 1727204527.31712: Calling groups_inventory to load vars for managed-node2 46400 1727204527.31717: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.31731: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.31734: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.31738: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.33158: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b6 46400 1727204527.33166: WORKER PROCESS EXITING 46400 1727204527.35247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.37445: done with get_vars() 46400 1727204527.37467: variable 'ansible_search_path' from source: unknown 46400 1727204527.37468: variable 'ansible_search_path' from source: unknown 46400 1727204527.37505: we have included files to process 46400 1727204527.37506: generating all_blocks data 46400 1727204527.37508: done generating all_blocks data 46400 1727204527.37551: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204527.37553: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204527.37558: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204527.37808: in VariableManager get_vars() 46400 1727204527.38112: done with get_vars() 46400 1727204527.38357: done processing included file 46400 1727204527.38358: iterating over new_blocks loaded from include file 46400 1727204527.38362: in VariableManager get_vars() 46400 1727204527.38384: done with get_vars() 46400 1727204527.38386: filtering new block on tags 46400 1727204527.38444: done filtering new block on tags 46400 1727204527.38447: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204527.38453: extending task lists for all hosts with included blocks 46400 1727204527.39007: done extending task lists 46400 1727204527.39008: done processing included files 46400 1727204527.39009: results queue empty 46400 1727204527.39009: checking for any_errors_fatal 46400 1727204527.39016: done checking for any_errors_fatal 46400 1727204527.39017: checking for max_fail_percentage 46400 1727204527.39018: done checking for max_fail_percentage 46400 1727204527.39019: checking to see if all hosts have failed and the running result is not ok 46400 1727204527.39020: done checking to see if all hosts have failed 46400 1727204527.39021: getting the remaining hosts for this loop 46400 1727204527.39022: done getting the remaining hosts for this loop 46400 1727204527.39025: getting the next task for host managed-node2 46400 1727204527.39037: done getting next task for host managed-node2 46400 1727204527.39039: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204527.39042: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204527.39045: getting variables 46400 1727204527.39046: in VariableManager get_vars() 46400 1727204527.39056: Calling all_inventory to load vars for managed-node2 46400 1727204527.39058: Calling groups_inventory to load vars for managed-node2 46400 1727204527.39065: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.39072: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.39074: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.39077: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.41330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.43268: done with get_vars() 46400 1727204527.43294: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.144) 0:00:17.719 ***** 46400 1727204527.43465: entering _queue_task() for managed-node2/include_tasks 46400 1727204527.43920: worker is 1 (out of 1 available) 46400 1727204527.43935: exiting _queue_task() for managed-node2/include_tasks 46400 1727204527.43949: done queuing things up, now waiting for results queue to drain 46400 1727204527.43950: waiting for pending results... 
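show_interfaces.yml:3 does nothing but pull in get_current_interfaces.yml, whose first task ("Gather current interface info", identified as the next task at the end of this excerpt) collects the live interface names before they are displayed. Neither file's text appears in this log, so the wrapper below is only a sketch; the second task and the current_interfaces variable it prints are assumptions about the usual shape of such a helper, not quotations from the run.

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current interfaces  # assumed follow-up step, not shown in this excerpt
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"  # variable name is an assumption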
46400 1727204527.44329: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204527.44466: in run() - task 0affcd87-79f5-1303-fda8-0000000005dd 46400 1727204527.44486: variable 'ansible_search_path' from source: unknown 46400 1727204527.44493: variable 'ansible_search_path' from source: unknown 46400 1727204527.44544: calling self._execute() 46400 1727204527.44652: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.44666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.44685: variable 'omit' from source: magic vars 46400 1727204527.45098: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.45121: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.45133: _execute() done 46400 1727204527.45141: dumping result to json 46400 1727204527.45149: done dumping result, returning 46400 1727204527.45167: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-0000000005dd] 46400 1727204527.45179: sending task result for task 0affcd87-79f5-1303-fda8-0000000005dd 46400 1727204527.45315: no more pending results, returning what we have 46400 1727204527.45321: in VariableManager get_vars() 46400 1727204527.45368: Calling all_inventory to load vars for managed-node2 46400 1727204527.45371: Calling groups_inventory to load vars for managed-node2 46400 1727204527.45375: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.45389: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.45394: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.45397: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.46586: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005dd 46400 1727204527.46590: WORKER PROCESS EXITING 46400 1727204527.47493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.49624: done with get_vars() 46400 1727204527.49669: variable 'ansible_search_path' from source: unknown 46400 1727204527.49673: variable 'ansible_search_path' from source: unknown 46400 1727204527.49719: we have included files to process 46400 1727204527.49721: generating all_blocks data 46400 1727204527.49723: done generating all_blocks data 46400 1727204527.49724: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204527.49725: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204527.49728: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204527.50093: done processing included file 46400 1727204527.50095: iterating over new_blocks loaded from include file 46400 1727204527.50097: in VariableManager get_vars() 46400 1727204527.50113: done with get_vars() 46400 1727204527.50115: filtering new block on tags 46400 1727204527.50298: done filtering new block on tags 46400 1727204527.50300: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 46400 1727204527.50306: extending task lists for all hosts with included blocks 46400 1727204527.50554: done extending task lists 46400 1727204527.50555: done processing included files 46400 1727204527.50556: results queue empty 46400 1727204527.50557: checking for any_errors_fatal 46400 1727204527.50560: done checking for any_errors_fatal 46400 1727204527.50561: checking for max_fail_percentage 46400 1727204527.50563: done checking for max_fail_percentage 46400 1727204527.50566: checking to see if all hosts have failed and the running result is not ok 46400 1727204527.50567: done checking to see if all hosts have failed 46400 1727204527.50568: getting the remaining hosts for this loop 46400 1727204527.50569: done getting the remaining hosts for this loop 46400 1727204527.50572: getting the next task for host managed-node2 46400 1727204527.50577: done getting next task for host managed-node2 46400 1727204527.50580: ^ task is: TASK: Gather current interface info 46400 1727204527.50583: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204527.50585: getting variables 46400 1727204527.50586: in VariableManager get_vars() 46400 1727204527.50600: Calling all_inventory to load vars for managed-node2 46400 1727204527.50603: Calling groups_inventory to load vars for managed-node2 46400 1727204527.50605: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.50611: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.50614: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.50616: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.51925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.54919: done with get_vars() 46400 1727204527.54958: done getting variables 46400 1727204527.55011: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.115) 0:00:17.835 ***** 46400 1727204527.55047: entering _queue_task() for managed-node2/command 46400 1727204527.55396: worker is 1 (out of 1 available) 46400 1727204527.55409: exiting _queue_task() for managed-node2/command 46400 1727204527.55423: done queuing things up, now waiting for results queue to drain 46400 1727204527.55424: waiting for pending results... 
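The task just queued (get_current_interfaces.yml:3) is a plain command task; a hedged sketch follows, assembled from the module_args that appear in the execution trace below. The register name _current_interfaces is inferred from the later "variable '_current_interfaces' from source: set_fact" entry and is an assumption, not something visible in the task banner itself.

    # Hedged sketch of "Gather current interface info" (get_current_interfaces.yml:3).
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net        # both values match the module_args dumped below
      register: _current_interfaces  # assumed name, taken from the later set_fact trace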
46400 1727204527.55722: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204527.55878: in run() - task 0affcd87-79f5-1303-fda8-000000000618 46400 1727204527.55898: variable 'ansible_search_path' from source: unknown 46400 1727204527.55905: variable 'ansible_search_path' from source: unknown 46400 1727204527.55940: calling self._execute() 46400 1727204527.56027: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.56039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.56059: variable 'omit' from source: magic vars 46400 1727204527.56480: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.56503: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.56519: variable 'omit' from source: magic vars 46400 1727204527.56578: variable 'omit' from source: magic vars 46400 1727204527.56625: variable 'omit' from source: magic vars 46400 1727204527.56675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204527.56722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204527.56755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204527.56781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.56797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204527.56841: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204527.56850: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.56858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.56970: Set connection var ansible_shell_type to sh 46400 1727204527.56986: Set connection var ansible_shell_executable to /bin/sh 46400 1727204527.56997: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204527.57007: Set connection var ansible_connection to ssh 46400 1727204527.57016: Set connection var ansible_pipelining to False 46400 1727204527.57026: Set connection var ansible_timeout to 10 46400 1727204527.57066: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.57075: variable 'ansible_connection' from source: unknown 46400 1727204527.57083: variable 'ansible_module_compression' from source: unknown 46400 1727204527.57090: variable 'ansible_shell_type' from source: unknown 46400 1727204527.57097: variable 'ansible_shell_executable' from source: unknown 46400 1727204527.57103: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.57110: variable 'ansible_pipelining' from source: unknown 46400 1727204527.57117: variable 'ansible_timeout' from source: unknown 46400 1727204527.57124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.57290: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204527.57308: variable 'omit' from source: magic vars 46400 
1727204527.57317: starting attempt loop 46400 1727204527.57323: running the handler 46400 1727204527.57343: _low_level_execute_command(): starting 46400 1727204527.57361: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204527.58186: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.58203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.58262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.58286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.58326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.58369: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.58385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.58402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.58414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204527.58425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.58440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.58455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.58482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.58583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.58595: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.58609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.58697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.58722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.58740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.58821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.60480: stdout chunk (state=3): >>>/root <<< 46400 1727204527.60684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204527.60687: stdout chunk (state=3): >>><<< 46400 1727204527.60690: stderr chunk (state=3): >>><<< 46400 1727204527.60811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204527.60815: _low_level_execute_command(): starting 46400 1727204527.60819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550 `" && echo ansible-tmp-1727204527.6071086-47560-126182866077550="` echo /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550 `" ) && sleep 0' 46400 1727204527.61757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.61805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.61822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.61841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.61940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.61953: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.61970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.61988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.62003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204527.62019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.62031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.62046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.62066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.62133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.62146: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.62161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.62353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.62381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.62399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.62481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.64388: stdout chunk (state=3): >>>ansible-tmp-1727204527.6071086-47560-126182866077550=/root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550 <<< 46400 1727204527.64591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204527.64628: stderr chunk (state=3): >>><<< 46400 1727204527.64631: stdout chunk (state=3): >>><<< 46400 1727204527.64876: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204527.6071086-47560-126182866077550=/root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204527.64879: variable 'ansible_module_compression' from source: unknown 46400 1727204527.64881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204527.64883: variable 'ansible_facts' from source: unknown 46400 1727204527.64897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/AnsiballZ_command.py 46400 1727204527.65268: Sending initial data 46400 1727204527.65272: Sent initial data (156 bytes) 46400 1727204527.66245: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.66255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.66267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.66282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.66322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.66330: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.66340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.66353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.66365: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204527.66371: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.66380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.66389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.66402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.66409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.66416: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.66424: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.66499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.66526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.66530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.66608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.68337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204527.68374: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204527.68413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp7iupxjee /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/AnsiballZ_command.py <<< 46400 1727204527.68451: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204527.69695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204527.69870: stderr chunk (state=3): >>><<< 46400 1727204527.69873: stdout chunk (state=3): >>><<< 46400 1727204527.69876: done transferring module to remote 46400 1727204527.69882: _low_level_execute_command(): starting 46400 1727204527.69885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/ /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/AnsiballZ_command.py && sleep 0' 46400 1727204527.70579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.70597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.70608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.70622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.70665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.70671: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.70685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.70703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.70711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204527.70721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.70724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.70737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.70744: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.70751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.70767: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.70770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.70902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.70906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.70914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.71055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.72780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204527.72784: stdout chunk (state=3): >>><<< 46400 1727204527.72792: stderr chunk (state=3): >>><<< 46400 1727204527.72816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204527.72820: _low_level_execute_command(): starting 46400 1727204527.72825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/AnsiballZ_command.py && sleep 0' 46400 1727204527.73519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.73528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.73539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.73553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.73619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.73623: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.73625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.73641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.73648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204527.73655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.73670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.73681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.73694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.73701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.73707: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.73717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.73797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.73812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.73815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.73915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.87451: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:07.870047", "end": "2024-09-24 15:02:07.873513", "delta": "0:00:00.003466", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204527.89297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204527.89301: stdout chunk (state=3): >>><<< 46400 1727204527.89303: stderr chunk (state=3): >>><<< 46400 1727204527.89473: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:07.870047", "end": "2024-09-24 15:02:07.873513", "delta": "0:00:00.003466", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
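From the module output just dumped, the registered result (assumed here to be _current_interfaces, per the set_fact trace further down) would contain roughly the fields below. Values are transcribed from the JSON above; stdout_lines is the command module's derived list form of stdout and is the one field shown here that is not literally in the dump.

    _current_interfaces:
      changed: true
      rc: 0
      cmd: ["ls", "-1"]
      start: "2024-09-24 15:02:07.870047"
      end: "2024-09-24 15:02:07.873513"
      delta: "0:00:00.003466"
      stdout: "bonding_masters\neth0\nlo"
      stdout_lines: ["bonding_masters", "eth0", "lo"]
      stderr: ""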
46400 1727204527.89482: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204527.89485: _low_level_execute_command(): starting 46400 1727204527.89487: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204527.6071086-47560-126182866077550/ > /dev/null 2>&1 && sleep 0' 46400 1727204527.90295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204527.90309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.90324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.90349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.90396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.90409: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204527.90424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.90443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204527.90465: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204527.90479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204527.90491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204527.90504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204527.90518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204527.90531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204527.90542: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204527.90554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204527.90649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204527.90670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204527.90693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204527.90910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204527.92805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204527.92809: stdout chunk (state=3): >>><<< 46400 1727204527.92811: stderr chunk (state=3): >>><<< 46400 1727204527.93074: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204527.93078: handler run complete 46400 1727204527.93081: Evaluated conditional (False): False 46400 1727204527.93083: attempt loop complete, returning result 46400 1727204527.93085: _execute() done 46400 1727204527.93086: dumping result to json 46400 1727204527.93088: done dumping result, returning 46400 1727204527.93090: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-000000000618] 46400 1727204527.93092: sending task result for task 0affcd87-79f5-1303-fda8-000000000618 46400 1727204527.93170: done sending task result for task 0affcd87-79f5-1303-fda8-000000000618 46400 1727204527.93174: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003466", "end": "2024-09-24 15:02:07.873513", "rc": 0, "start": "2024-09-24 15:02:07.870047" } STDOUT: bonding_masters eth0 lo 46400 1727204527.93258: no more pending results, returning what we have 46400 1727204527.93276: results queue empty 46400 1727204527.93278: checking for any_errors_fatal 46400 1727204527.93280: done checking for any_errors_fatal 46400 1727204527.93281: checking for max_fail_percentage 46400 1727204527.93283: done checking for max_fail_percentage 46400 1727204527.93284: checking to see if all hosts have failed and the running result is not ok 46400 1727204527.93285: done checking to see if all hosts have failed 46400 1727204527.93286: getting the remaining hosts for this loop 46400 1727204527.93288: done getting the remaining hosts for this loop 46400 1727204527.93292: getting the next task for host managed-node2 46400 1727204527.93302: done getting next task for host managed-node2 46400 1727204527.93305: ^ task is: TASK: Set current_interfaces 46400 1727204527.93311: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204527.93316: getting variables 46400 1727204527.93318: in VariableManager get_vars() 46400 1727204527.93354: Calling all_inventory to load vars for managed-node2 46400 1727204527.93357: Calling groups_inventory to load vars for managed-node2 46400 1727204527.93365: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204527.93379: Calling all_plugins_play to load vars for managed-node2 46400 1727204527.93382: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204527.93389: Calling groups_plugins_play to load vars for managed-node2 46400 1727204527.96320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204527.98142: done with get_vars() 46400 1727204527.98178: done getting variables 46400 1727204527.98240: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.432) 0:00:18.267 ***** 46400 1727204527.98284: entering _queue_task() for managed-node2/set_fact 46400 1727204527.98612: worker is 1 (out of 1 available) 46400 1727204527.98625: exiting _queue_task() for managed-node2/set_fact 46400 1727204527.98639: done queuing things up, now waiting for results queue to drain 46400 1727204527.98641: waiting for pending results... 
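The set_fact task queued above (get_current_interfaces.yml:9) evidently turns the registered command output into the current_interfaces fact shown in the result that follows. A hedged sketch; the exact Jinja expression is an assumption, chosen because the resulting fact value below matches the command's stdout_lines.

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # assumption; produces the list logged below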
46400 1727204527.98948: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204527.99100: in run() - task 0affcd87-79f5-1303-fda8-000000000619 46400 1727204527.99117: variable 'ansible_search_path' from source: unknown 46400 1727204527.99123: variable 'ansible_search_path' from source: unknown 46400 1727204527.99169: calling self._execute() 46400 1727204527.99262: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204527.99276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204527.99288: variable 'omit' from source: magic vars 46400 1727204527.99644: variable 'ansible_distribution_major_version' from source: facts 46400 1727204527.99659: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204527.99675: variable 'omit' from source: magic vars 46400 1727204527.99729: variable 'omit' from source: magic vars 46400 1727204527.99841: variable '_current_interfaces' from source: set_fact 46400 1727204527.99931: variable 'omit' from source: magic vars 46400 1727204527.99989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204528.00055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204528.00085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204528.00109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.00137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.00188: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204528.00197: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.00204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.00353: Set connection var ansible_shell_type to sh 46400 1727204528.00371: Set connection var ansible_shell_executable to /bin/sh 46400 1727204528.00397: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204528.00409: Set connection var ansible_connection to ssh 46400 1727204528.00418: Set connection var ansible_pipelining to False 46400 1727204528.00436: Set connection var ansible_timeout to 10 46400 1727204528.00488: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.00499: variable 'ansible_connection' from source: unknown 46400 1727204528.00507: variable 'ansible_module_compression' from source: unknown 46400 1727204528.00521: variable 'ansible_shell_type' from source: unknown 46400 1727204528.00535: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.00543: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.00559: variable 'ansible_pipelining' from source: unknown 46400 1727204528.00570: variable 'ansible_timeout' from source: unknown 46400 1727204528.00579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.00753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204528.00777: variable 'omit' from source: magic vars 46400 1727204528.00796: starting attempt loop 46400 1727204528.00804: running the handler 46400 1727204528.00824: handler run complete 46400 1727204528.00840: attempt loop complete, returning result 46400 1727204528.00854: _execute() done 46400 1727204528.00861: dumping result to json 46400 1727204528.00872: done dumping result, returning 46400 1727204528.00890: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-000000000619] 46400 1727204528.00901: sending task result for task 0affcd87-79f5-1303-fda8-000000000619 46400 1727204528.01018: done sending task result for task 0affcd87-79f5-1303-fda8-000000000619 46400 1727204528.01026: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204528.01092: no more pending results, returning what we have 46400 1727204528.01096: results queue empty 46400 1727204528.01098: checking for any_errors_fatal 46400 1727204528.01107: done checking for any_errors_fatal 46400 1727204528.01108: checking for max_fail_percentage 46400 1727204528.01110: done checking for max_fail_percentage 46400 1727204528.01113: checking to see if all hosts have failed and the running result is not ok 46400 1727204528.01114: done checking to see if all hosts have failed 46400 1727204528.01115: getting the remaining hosts for this loop 46400 1727204528.01117: done getting the remaining hosts for this loop 46400 1727204528.01121: getting the next task for host managed-node2 46400 1727204528.01131: done getting next task for host managed-node2 46400 1727204528.01134: ^ task is: TASK: Show current_interfaces 46400 1727204528.01138: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204528.01143: getting variables 46400 1727204528.01145: in VariableManager get_vars() 46400 1727204528.01181: Calling all_inventory to load vars for managed-node2 46400 1727204528.01184: Calling groups_inventory to load vars for managed-node2 46400 1727204528.01188: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.01199: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.01202: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.01205: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.03148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.05023: done with get_vars() 46400 1727204528.05045: done getting variables 46400 1727204528.05109: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.068) 0:00:18.335 ***** 46400 1727204528.05141: entering _queue_task() for managed-node2/debug 46400 1727204528.05455: worker is 1 (out of 1 available) 46400 1727204528.05473: exiting _queue_task() for managed-node2/debug 46400 1727204528.05486: done queuing things up, now waiting for results queue to drain 46400 1727204528.05488: waiting for pending results... 46400 1727204528.05772: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204528.05885: in run() - task 0affcd87-79f5-1303-fda8-0000000005de 46400 1727204528.05898: variable 'ansible_search_path' from source: unknown 46400 1727204528.05901: variable 'ansible_search_path' from source: unknown 46400 1727204528.05941: calling self._execute() 46400 1727204528.06029: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.06036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.06050: variable 'omit' from source: magic vars 46400 1727204528.07444: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.07453: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.07463: variable 'omit' from source: magic vars 46400 1727204528.07969: variable 'omit' from source: magic vars 46400 1727204528.07972: variable 'current_interfaces' from source: set_fact 46400 1727204528.07974: variable 'omit' from source: magic vars 46400 1727204528.08036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204528.08143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204528.08170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204528.08189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.08200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.08231: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204528.08348: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.08353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.08448: Set connection var ansible_shell_type to sh 46400 1727204528.08575: Set connection var ansible_shell_executable to /bin/sh 46400 1727204528.08580: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204528.08585: Set connection var ansible_connection to ssh 46400 1727204528.08590: Set connection var ansible_pipelining to False 46400 1727204528.08596: Set connection var ansible_timeout to 10 46400 1727204528.08621: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.08626: variable 'ansible_connection' from source: unknown 46400 1727204528.08629: variable 'ansible_module_compression' from source: unknown 46400 1727204528.08631: variable 'ansible_shell_type' from source: unknown 46400 1727204528.08633: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.08635: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.08637: variable 'ansible_pipelining' from source: unknown 46400 1727204528.08639: variable 'ansible_timeout' from source: unknown 46400 1727204528.08641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.09026: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204528.09036: variable 'omit' from source: magic vars 46400 1727204528.09042: starting attempt loop 46400 1727204528.09044: running the handler 46400 1727204528.09094: handler run complete 46400 1727204528.09221: attempt loop complete, returning result 46400 1727204528.09224: _execute() done 46400 1727204528.09227: dumping result to json 46400 1727204528.09229: done dumping result, returning 46400 1727204528.09236: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-0000000005de] 46400 1727204528.09242: sending task result for task 0affcd87-79f5-1303-fda8-0000000005de 46400 1727204528.09341: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005de 46400 1727204528.09345: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204528.09400: no more pending results, returning what we have 46400 1727204528.09404: results queue empty 46400 1727204528.09405: checking for any_errors_fatal 46400 1727204528.09409: done checking for any_errors_fatal 46400 1727204528.09410: checking for max_fail_percentage 46400 1727204528.09412: done checking for max_fail_percentage 46400 1727204528.09413: checking to see if all hosts have failed and the running result is not ok 46400 1727204528.09414: done checking to see if all hosts have failed 46400 1727204528.09415: getting the remaining hosts for this loop 46400 1727204528.09416: done getting the remaining hosts for this loop 46400 1727204528.09421: getting the next task for host managed-node2 46400 1727204528.09432: done getting next task for host managed-node2 46400 1727204528.09436: ^ task is: TASK: Setup 46400 1727204528.09439: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204528.09444: getting variables 46400 1727204528.09446: in VariableManager get_vars() 46400 1727204528.09486: Calling all_inventory to load vars for managed-node2 46400 1727204528.09489: Calling groups_inventory to load vars for managed-node2 46400 1727204528.09493: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.09506: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.09508: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.09511: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.11200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.12854: done with get_vars() 46400 1727204528.12888: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.078) 0:00:18.414 ***** 46400 1727204528.12995: entering _queue_task() for managed-node2/include_tasks 46400 1727204528.13332: worker is 1 (out of 1 available) 46400 1727204528.13345: exiting _queue_task() for managed-node2/include_tasks 46400 1727204528.13359: done queuing things up, now waiting for results queue to drain 46400 1727204528.13368: waiting for pending results... 
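The 'Setup' task queued above (run_test.yml:24) appears, from the "variable 'lsr_setup' from source: include params" entry and the per-item conditional evaluations in the trace that follows, to be an include_tasks loop over the lsr_setup list. A hedged sketch; whether the version guard is written inline or inherited from an enclosing block is not visible in this log, and the loop structure itself is an inference.

    - name: Setup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_setup }}"
      when: ansible_distribution_major_version != '6'   # the guard is evaluated per item in the trace below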
46400 1727204528.13667: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204528.13760: in run() - task 0affcd87-79f5-1303-fda8-0000000005b7 46400 1727204528.13780: variable 'ansible_search_path' from source: unknown 46400 1727204528.13783: variable 'ansible_search_path' from source: unknown 46400 1727204528.13828: variable 'lsr_setup' from source: include params 46400 1727204528.14039: variable 'lsr_setup' from source: include params 46400 1727204528.14111: variable 'omit' from source: magic vars 46400 1727204528.14232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.14244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.14256: variable 'omit' from source: magic vars 46400 1727204528.14496: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.14510: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.14517: variable 'item' from source: unknown 46400 1727204528.14588: variable 'item' from source: unknown 46400 1727204528.14619: variable 'item' from source: unknown 46400 1727204528.14683: variable 'item' from source: unknown 46400 1727204528.14804: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.14808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.14810: variable 'omit' from source: magic vars 46400 1727204528.14952: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.14957: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.14967: variable 'item' from source: unknown 46400 1727204528.15032: variable 'item' from source: unknown 46400 1727204528.15061: variable 'item' from source: unknown 46400 1727204528.15122: variable 'item' from source: unknown 46400 1727204528.15195: dumping result to json 46400 1727204528.15198: done dumping result, returning 46400 1727204528.15200: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-0000000005b7] 46400 1727204528.15202: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b7 46400 1727204528.15235: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b7 46400 1727204528.15239: WORKER PROCESS EXITING 46400 1727204528.15269: no more pending results, returning what we have 46400 1727204528.15275: in VariableManager get_vars() 46400 1727204528.15315: Calling all_inventory to load vars for managed-node2 46400 1727204528.15318: Calling groups_inventory to load vars for managed-node2 46400 1727204528.15322: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.15337: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.15340: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.15343: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.17204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.18806: done with get_vars() 46400 1727204528.18832: variable 'ansible_search_path' from source: unknown 46400 1727204528.18834: variable 'ansible_search_path' from source: unknown 46400 1727204528.18886: variable 'ansible_search_path' from source: unknown 46400 1727204528.18887: variable 'ansible_search_path' from source: unknown 46400 1727204528.18919: we have included files to process 46400 1727204528.18920: generating all_blocks data 46400 
1727204528.18922: done generating all_blocks data 46400 1727204528.18926: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204528.18927: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204528.18929: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 46400 1727204528.19124: done processing included file 46400 1727204528.19126: iterating over new_blocks loaded from include file 46400 1727204528.19127: in VariableManager get_vars() 46400 1727204528.19143: done with get_vars() 46400 1727204528.19145: filtering new block on tags 46400 1727204528.19176: done filtering new block on tags 46400 1727204528.19178: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 => (item=tasks/delete_interface.yml) 46400 1727204528.19183: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204528.19185: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204528.19188: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204528.19279: in VariableManager get_vars() 46400 1727204528.19298: done with get_vars() 46400 1727204528.19392: done processing included file 46400 1727204528.19394: iterating over new_blocks loaded from include file 46400 1727204528.19395: in VariableManager get_vars() 46400 1727204528.19408: done with get_vars() 46400 1727204528.19410: filtering new block on tags 46400 1727204528.19442: done filtering new block on tags 46400 1727204528.19445: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 46400 1727204528.19448: extending task lists for all hosts with included blocks 46400 1727204528.20099: done extending task lists 46400 1727204528.20100: done processing included files 46400 1727204528.20101: results queue empty 46400 1727204528.20102: checking for any_errors_fatal 46400 1727204528.20106: done checking for any_errors_fatal 46400 1727204528.20106: checking for max_fail_percentage 46400 1727204528.20107: done checking for max_fail_percentage 46400 1727204528.20108: checking to see if all hosts have failed and the running result is not ok 46400 1727204528.20109: done checking to see if all hosts have failed 46400 1727204528.20110: getting the remaining hosts for this loop 46400 1727204528.20111: done getting the remaining hosts for this loop 46400 1727204528.20113: getting the next task for host managed-node2 46400 1727204528.20118: done getting next task for host managed-node2 46400 1727204528.20120: ^ task is: TASK: Remove test interface if necessary 46400 1727204528.20123: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204528.20125: getting variables 46400 1727204528.20126: in VariableManager get_vars() 46400 1727204528.20140: Calling all_inventory to load vars for managed-node2 46400 1727204528.20142: Calling groups_inventory to load vars for managed-node2 46400 1727204528.20145: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.20150: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.20152: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.20155: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.21423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.23053: done with get_vars() 46400 1727204528.23079: done getting variables 46400 1727204528.23125: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.101) 0:00:18.516 ***** 46400 1727204528.23156: entering _queue_task() for managed-node2/command 46400 1727204528.23501: worker is 1 (out of 1 available) 46400 1727204528.23513: exiting _queue_task() for managed-node2/command 46400 1727204528.23527: done queuing things up, now waiting for results queue to drain 46400 1727204528.23529: waiting for pending results... 
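
The task announced above comes from tasks/delete_interface.yml. The recorded module arguments (_raw_params: "ip link del statebr") and the "...ignoring" marker that follows the failed result in this trace are consistent with a task roughly like the sketch below; this is a reconstruction, not the verbatim file, and interface resolves to "statebr" from play vars:

    # Sketch based on the recorded command invocation and the ignored non-zero return code.
    - name: Remove test interface if necessary
      command: ip link del {{ interface }}
      ignore_errors: true
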
46400 1727204528.23857: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 46400 1727204528.23933: in run() - task 0affcd87-79f5-1303-fda8-00000000063e 46400 1727204528.23944: variable 'ansible_search_path' from source: unknown 46400 1727204528.23948: variable 'ansible_search_path' from source: unknown 46400 1727204528.23989: calling self._execute() 46400 1727204528.24077: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.24088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.24097: variable 'omit' from source: magic vars 46400 1727204528.24470: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.24482: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.24489: variable 'omit' from source: magic vars 46400 1727204528.24542: variable 'omit' from source: magic vars 46400 1727204528.24643: variable 'interface' from source: play vars 46400 1727204528.24660: variable 'omit' from source: magic vars 46400 1727204528.24704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204528.24743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204528.24766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204528.24784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.24795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.24823: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204528.24826: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.24829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.24930: Set connection var ansible_shell_type to sh 46400 1727204528.24940: Set connection var ansible_shell_executable to /bin/sh 46400 1727204528.24946: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204528.24956: Set connection var ansible_connection to ssh 46400 1727204528.24965: Set connection var ansible_pipelining to False 46400 1727204528.24973: Set connection var ansible_timeout to 10 46400 1727204528.24997: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.25000: variable 'ansible_connection' from source: unknown 46400 1727204528.25002: variable 'ansible_module_compression' from source: unknown 46400 1727204528.25005: variable 'ansible_shell_type' from source: unknown 46400 1727204528.25007: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.25009: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.25013: variable 'ansible_pipelining' from source: unknown 46400 1727204528.25015: variable 'ansible_timeout' from source: unknown 46400 1727204528.25020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.25158: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204528.25175: variable 'omit' from source: magic vars 46400 1727204528.25180: starting attempt loop 46400 1727204528.25183: running the handler 46400 1727204528.25201: _low_level_execute_command(): starting 46400 1727204528.25209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204528.26004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.26017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.26029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.26049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.26095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.26102: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.26112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.26126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.26134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.26143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.26152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.26173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.26185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.26193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.26200: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.26210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.26291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.26312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.26324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.26408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.28092: stdout chunk (state=3): >>>/root <<< 46400 1727204528.28254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.28262: stdout chunk (state=3): >>><<< 46400 1727204528.28271: stderr chunk (state=3): >>><<< 46400 1727204528.28294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.28308: _low_level_execute_command(): starting 46400 1727204528.28314: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819 `" && echo ansible-tmp-1727204528.2829406-47648-156493521866819="` echo /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819 `" ) && sleep 0' 46400 1727204528.28911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.28921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.28934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.28946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.28988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.28996: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.29003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.29017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.29023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.29030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.29038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.29046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.29057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.29066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.29074: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.29083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.29163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.29174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.29177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.29247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.31159: stdout chunk (state=3): >>>ansible-tmp-1727204528.2829406-47648-156493521866819=/root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819 <<< 46400 1727204528.31270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.31359: stderr chunk (state=3): >>><<< 46400 1727204528.31366: stdout chunk (state=3): >>><<< 46400 1727204528.31386: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204528.2829406-47648-156493521866819=/root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.31419: variable 'ansible_module_compression' from source: unknown 46400 1727204528.31478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204528.31512: variable 'ansible_facts' from source: unknown 46400 1727204528.31602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/AnsiballZ_command.py 46400 1727204528.31750: Sending initial data 46400 1727204528.31753: Sent initial data (156 bytes) 46400 1727204528.32713: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.32722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.32732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.32746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.32787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.32793: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.32804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.32817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.32824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.32830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.32839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.32849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.32865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.32876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.32884: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.32893: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.32967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.32982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.32985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.33054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.34822: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204528.34854: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204528.34895: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpaqpvkv18 /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/AnsiballZ_command.py <<< 46400 1727204528.34935: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204528.35911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.36017: stderr chunk (state=3): >>><<< 46400 1727204528.36020: stdout chunk (state=3): >>><<< 46400 1727204528.36036: done transferring module to remote 46400 1727204528.36047: _low_level_execute_command(): starting 46400 1727204528.36052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/ /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/AnsiballZ_command.py && sleep 0' 46400 1727204528.36498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.36504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.36535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.36541: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.36550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.36567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.36571: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.36580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.36586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.36639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.36643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.36651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.36704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.38449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.38505: stderr chunk (state=3): >>><<< 46400 1727204528.38507: stdout chunk (state=3): >>><<< 46400 1727204528.38571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.38575: _low_level_execute_command(): starting 46400 1727204528.38578: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/AnsiballZ_command.py && sleep 0' 46400 1727204528.38951: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.38957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.39002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.39006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.39009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.39074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.39077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 46400 1727204528.39118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.52991: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:02:08.522885", "end": "2024-09-24 15:02:08.528974", "delta": "0:00:00.006089", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204528.54110: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. <<< 46400 1727204528.54170: stderr chunk (state=3): >>><<< 46400 1727204528.54173: stdout chunk (state=3): >>><<< 46400 1727204528.54192: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:02:08.522885", "end": "2024-09-24 15:02:08.528974", "delta": "0:00:00.006089", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
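
A little later in this trace, tasks/assert_device_absent.yml includes tasks/get_interface_stat.yml and the "Get stat for interface statebr" task enters _queue_task() for managed-node2/stat. Only the task name and the use of the stat module are visible in the log; a sketch under the assumption that the file stats the interface's sysfs node and registers the result under a hypothetical name:

    # Hypothetical sketch of tasks/get_interface_stat.yml; the /sys/class/net path
    # and the register name are assumptions not shown in this log.
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
      register: interface_stat
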
46400 1727204528.54222: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204528.54231: _low_level_execute_command(): starting 46400 1727204528.54237: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204528.2829406-47648-156493521866819/ > /dev/null 2>&1 && sleep 0' 46400 1727204528.54720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.54727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.54780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.54784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.54787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.54835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.54848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.54900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.56686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.56738: stderr chunk (state=3): >>><<< 46400 1727204528.56741: stdout chunk (state=3): >>><<< 46400 1727204528.56755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.56760: handler run complete 46400 1727204528.56784: Evaluated conditional (False): False 46400 1727204528.56792: attempt loop complete, returning result 46400 1727204528.56795: _execute() done 46400 1727204528.56797: dumping result to json 46400 1727204528.56802: done dumping result, returning 46400 1727204528.56810: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [0affcd87-79f5-1303-fda8-00000000063e] 46400 1727204528.56815: sending task result for task 0affcd87-79f5-1303-fda8-00000000063e 46400 1727204528.56913: done sending task result for task 0affcd87-79f5-1303-fda8-00000000063e 46400 1727204528.56916: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006089", "end": "2024-09-24 15:02:08.528974", "rc": 1, "start": "2024-09-24 15:02:08.522885" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204528.56992: no more pending results, returning what we have 46400 1727204528.56997: results queue empty 46400 1727204528.56998: checking for any_errors_fatal 46400 1727204528.56999: done checking for any_errors_fatal 46400 1727204528.56999: checking for max_fail_percentage 46400 1727204528.57001: done checking for max_fail_percentage 46400 1727204528.57002: checking to see if all hosts have failed and the running result is not ok 46400 1727204528.57003: done checking to see if all hosts have failed 46400 1727204528.57004: getting the remaining hosts for this loop 46400 1727204528.57006: done getting the remaining hosts for this loop 46400 1727204528.57009: getting the next task for host managed-node2 46400 1727204528.57019: done getting next task for host managed-node2 46400 1727204528.57022: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204528.57025: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204528.57030: getting variables 46400 1727204528.57031: in VariableManager get_vars() 46400 1727204528.57061: Calling all_inventory to load vars for managed-node2 46400 1727204528.57065: Calling groups_inventory to load vars for managed-node2 46400 1727204528.57069: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.57080: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.57082: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.57084: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.57955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.58853: done with get_vars() 46400 1727204528.58871: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.357) 0:00:18.874 ***** 46400 1727204528.58942: entering _queue_task() for managed-node2/include_tasks 46400 1727204528.59165: worker is 1 (out of 1 available) 46400 1727204528.59180: exiting _queue_task() for managed-node2/include_tasks 46400 1727204528.59194: done queuing things up, now waiting for results queue to drain 46400 1727204528.59195: waiting for pending results... 46400 1727204528.59370: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204528.59450: in run() - task 0affcd87-79f5-1303-fda8-000000000642 46400 1727204528.59460: variable 'ansible_search_path' from source: unknown 46400 1727204528.59467: variable 'ansible_search_path' from source: unknown 46400 1727204528.59497: calling self._execute() 46400 1727204528.59567: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.59573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.59581: variable 'omit' from source: magic vars 46400 1727204528.59858: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.59875: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.59879: _execute() done 46400 1727204528.59882: dumping result to json 46400 1727204528.59884: done dumping result, returning 46400 1727204528.59891: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-000000000642] 46400 1727204528.59897: sending task result for task 0affcd87-79f5-1303-fda8-000000000642 46400 1727204528.59981: done sending task result for task 0affcd87-79f5-1303-fda8-000000000642 46400 1727204528.59983: WORKER PROCESS EXITING 46400 1727204528.60014: no more pending results, returning what we have 46400 1727204528.60019: in VariableManager get_vars() 46400 1727204528.60057: Calling all_inventory to load vars for managed-node2 46400 1727204528.60060: Calling groups_inventory to load vars for managed-node2 46400 1727204528.60065: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.60078: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.60081: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.60084: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.60872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204528.65518: done with get_vars() 46400 1727204528.65542: variable 'ansible_search_path' from source: unknown 46400 1727204528.65543: variable 'ansible_search_path' from source: unknown 46400 1727204528.65553: variable 'item' from source: include params 46400 1727204528.65647: variable 'item' from source: include params 46400 1727204528.65682: we have included files to process 46400 1727204528.65683: generating all_blocks data 46400 1727204528.65685: done generating all_blocks data 46400 1727204528.65687: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204528.65688: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204528.65690: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204528.65857: done processing included file 46400 1727204528.65859: iterating over new_blocks loaded from include file 46400 1727204528.65861: in VariableManager get_vars() 46400 1727204528.65879: done with get_vars() 46400 1727204528.65881: filtering new block on tags 46400 1727204528.65907: done filtering new block on tags 46400 1727204528.65910: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204528.65914: extending task lists for all hosts with included blocks 46400 1727204528.66074: done extending task lists 46400 1727204528.66075: done processing included files 46400 1727204528.66076: results queue empty 46400 1727204528.66077: checking for any_errors_fatal 46400 1727204528.66081: done checking for any_errors_fatal 46400 1727204528.66082: checking for max_fail_percentage 46400 1727204528.66083: done checking for max_fail_percentage 46400 1727204528.66084: checking to see if all hosts have failed and the running result is not ok 46400 1727204528.66084: done checking to see if all hosts have failed 46400 1727204528.66085: getting the remaining hosts for this loop 46400 1727204528.66086: done getting the remaining hosts for this loop 46400 1727204528.66088: getting the next task for host managed-node2 46400 1727204528.66092: done getting next task for host managed-node2 46400 1727204528.66094: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204528.66096: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204528.66098: getting variables 46400 1727204528.66099: in VariableManager get_vars() 46400 1727204528.66109: Calling all_inventory to load vars for managed-node2 46400 1727204528.66111: Calling groups_inventory to load vars for managed-node2 46400 1727204528.66113: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204528.66118: Calling all_plugins_play to load vars for managed-node2 46400 1727204528.66120: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204528.66123: Calling groups_plugins_play to load vars for managed-node2 46400 1727204528.67067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204528.67955: done with get_vars() 46400 1727204528.67973: done getting variables 46400 1727204528.68064: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.091) 0:00:18.965 ***** 46400 1727204528.68086: entering _queue_task() for managed-node2/stat 46400 1727204528.68319: worker is 1 (out of 1 available) 46400 1727204528.68332: exiting _queue_task() for managed-node2/stat 46400 1727204528.68346: done queuing things up, now waiting for results queue to drain 46400 1727204528.68349: waiting for pending results... 46400 1727204528.68532: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204528.68617: in run() - task 0affcd87-79f5-1303-fda8-000000000691 46400 1727204528.68627: variable 'ansible_search_path' from source: unknown 46400 1727204528.68630: variable 'ansible_search_path' from source: unknown 46400 1727204528.68657: calling self._execute() 46400 1727204528.68735: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.68738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.68746: variable 'omit' from source: magic vars 46400 1727204528.69021: variable 'ansible_distribution_major_version' from source: facts 46400 1727204528.69031: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204528.69037: variable 'omit' from source: magic vars 46400 1727204528.69081: variable 'omit' from source: magic vars 46400 1727204528.69147: variable 'interface' from source: play vars 46400 1727204528.69164: variable 'omit' from source: magic vars 46400 1727204528.69200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204528.69227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204528.69245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204528.69258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.69274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204528.69296: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204528.69300: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.69302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.69370: Set connection var ansible_shell_type to sh 46400 1727204528.69379: Set connection var ansible_shell_executable to /bin/sh 46400 1727204528.69385: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204528.69391: Set connection var ansible_connection to ssh 46400 1727204528.69394: Set connection var ansible_pipelining to False 46400 1727204528.69400: Set connection var ansible_timeout to 10 46400 1727204528.69418: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.69421: variable 'ansible_connection' from source: unknown 46400 1727204528.69424: variable 'ansible_module_compression' from source: unknown 46400 1727204528.69426: variable 'ansible_shell_type' from source: unknown 46400 1727204528.69428: variable 'ansible_shell_executable' from source: unknown 46400 1727204528.69430: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204528.69435: variable 'ansible_pipelining' from source: unknown 46400 1727204528.69438: variable 'ansible_timeout' from source: unknown 46400 1727204528.69440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204528.69592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204528.69601: variable 'omit' from source: magic vars 46400 1727204528.69605: starting attempt loop 46400 1727204528.69608: running the handler 46400 1727204528.69621: _low_level_execute_command(): starting 46400 1727204528.69627: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204528.70173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.70183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.70194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.70223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204528.70237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.70249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.70302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.70309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.70316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.70386: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 46400 1727204528.72066: stdout chunk (state=3): >>>/root <<< 46400 1727204528.72143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.72220: stderr chunk (state=3): >>><<< 46400 1727204528.72224: stdout chunk (state=3): >>><<< 46400 1727204528.72256: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.72274: _low_level_execute_command(): starting 46400 1727204528.72280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004 `" && echo ansible-tmp-1727204528.7225628-47670-148373357579004="` echo /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004 `" ) && sleep 0' 46400 1727204528.72756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.72761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.72798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204528.72810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204528.72821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.72831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.72884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.72896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.72945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 46400 1727204528.74835: stdout chunk (state=3): >>>ansible-tmp-1727204528.7225628-47670-148373357579004=/root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004 <<< 46400 1727204528.74948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.75006: stderr chunk (state=3): >>><<< 46400 1727204528.75009: stdout chunk (state=3): >>><<< 46400 1727204528.75024: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204528.7225628-47670-148373357579004=/root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.75063: variable 'ansible_module_compression' from source: unknown 46400 1727204528.75118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204528.75149: variable 'ansible_facts' from source: unknown 46400 1727204528.75214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/AnsiballZ_stat.py 46400 1727204528.75333: Sending initial data 46400 1727204528.75336: Sent initial data (153 bytes) 46400 1727204528.76027: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.76033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.76087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204528.76091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204528.76093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.76096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.76152: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.76155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.76158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.76201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.77953: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204528.77995: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204528.78022: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmplgbptz_d /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/AnsiballZ_stat.py <<< 46400 1727204528.78075: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204528.79186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.79285: stderr chunk (state=3): >>><<< 46400 1727204528.79289: stdout chunk (state=3): >>><<< 46400 1727204528.79313: done transferring module to remote 46400 1727204528.79323: _low_level_execute_command(): starting 46400 1727204528.79328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/ /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/AnsiballZ_stat.py && sleep 0' 46400 1727204528.79960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.79975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.79986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.80000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.80042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.80049: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.80059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.80079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.80086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.80092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.80099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.80108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.80119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 
1727204528.80127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.80135: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.80143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.80217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.80235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.80243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.80313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.82083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204528.82154: stderr chunk (state=3): >>><<< 46400 1727204528.82157: stdout chunk (state=3): >>><<< 46400 1727204528.82183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204528.82187: _low_level_execute_command(): starting 46400 1727204528.82190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/AnsiballZ_stat.py && sleep 0' 46400 1727204528.82890: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.82899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.82912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.82925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.82963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.82975: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.82985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.82998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.83005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.83011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.83019: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.83028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.83039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.83046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.83052: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.83061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.83145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.83150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.83158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.83233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204528.96540: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204528.97598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204528.97602: stdout chunk (state=3): >>><<< 46400 1727204528.97604: stderr chunk (state=3): >>><<< 46400 1727204528.97742: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
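
For reference, the module invocation captured above (the AnsiballZ_stat.py payload returning {"changed": false, "stat": {"exists": false}} for /sys/class/net/statebr) corresponds to a plain ansible.builtin.stat task probing the sysfs entry for the interface. A minimal sketch, assuming the usual test-role wiring: the task name, the path, and the disabled get_attributes/get_checksum/get_mime flags are taken from the log record above, while the 'interface' variable and the register name are assumptions.

    # Hypothetical reconstruction of the task whose module args appear above;
    # only the path and the disabled checksum/mime/attribute flags are confirmed.
    - name: Get stat for interface statebr
      ansible.builtin.stat:
        path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/statebr here
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat   # assumed variable name, matching the later lookup

With exists: false in the registered result, the assert that follows can pass without contacting the host again.
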
46400 1727204528.97746: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204528.97750: _low_level_execute_command(): starting 46400 1727204528.97752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204528.7225628-47670-148373357579004/ > /dev/null 2>&1 && sleep 0' 46400 1727204528.98372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204528.98392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.98410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.98429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.98478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.98489: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204528.98509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.98525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204528.98536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204528.98549: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204528.98581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204528.98595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204528.98616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204528.98627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204528.98637: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204528.98649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204528.98733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204528.98756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204528.98774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204528.98850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204529.00754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204529.00759: stdout chunk (state=3): >>><<< 46400 1727204529.00781: stderr chunk (state=3): >>><<< 46400 1727204529.01176: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204529.01180: handler run complete 46400 1727204529.01182: attempt loop complete, returning result 46400 1727204529.01184: _execute() done 46400 1727204529.01186: dumping result to json 46400 1727204529.01188: done dumping result, returning 46400 1727204529.01190: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000000691] 46400 1727204529.01192: sending task result for task 0affcd87-79f5-1303-fda8-000000000691 46400 1727204529.01266: done sending task result for task 0affcd87-79f5-1303-fda8-000000000691 46400 1727204529.01270: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204529.01431: no more pending results, returning what we have 46400 1727204529.01436: results queue empty 46400 1727204529.01437: checking for any_errors_fatal 46400 1727204529.01439: done checking for any_errors_fatal 46400 1727204529.01440: checking for max_fail_percentage 46400 1727204529.01442: done checking for max_fail_percentage 46400 1727204529.01443: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.01444: done checking to see if all hosts have failed 46400 1727204529.01445: getting the remaining hosts for this loop 46400 1727204529.01447: done getting the remaining hosts for this loop 46400 1727204529.01451: getting the next task for host managed-node2 46400 1727204529.01514: done getting next task for host managed-node2 46400 1727204529.01518: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 46400 1727204529.01523: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.01527: getting variables 46400 1727204529.01529: in VariableManager get_vars() 46400 1727204529.01583: Calling all_inventory to load vars for managed-node2 46400 1727204529.01586: Calling groups_inventory to load vars for managed-node2 46400 1727204529.01591: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.01604: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.01607: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.01610: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.03338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.05422: done with get_vars() 46400 1727204529.05453: done getting variables 46400 1727204529.05535: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204529.05688: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.376) 0:00:19.341 ***** 46400 1727204529.05730: entering _queue_task() for managed-node2/assert 46400 1727204529.06090: worker is 1 (out of 1 available) 46400 1727204529.06104: exiting _queue_task() for managed-node2/assert 46400 1727204529.06117: done queuing things up, now waiting for results queue to drain 46400 1727204529.06119: waiting for pending results... 
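
The handler being queued here is the assert action for tasks/assert_device_absent.yml:5. Judging from the conditional echoed a few records below (not interface_stat.stat.exists), the task plausibly has the following shape; the failure message wording is an assumption, everything else is read from the log.

    # Plausible shape of the assert task; the condition matches the
    # "Evaluated conditional (not interface_stat.stat.exists): True" record below.
    - name: "Assert that the interface is absent - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - not interface_stat.stat.exists
        msg: "Interface {{ interface }} is still present"   # assumed wording
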
46400 1727204529.06370: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 46400 1727204529.06473: in run() - task 0affcd87-79f5-1303-fda8-000000000643 46400 1727204529.06489: variable 'ansible_search_path' from source: unknown 46400 1727204529.06492: variable 'ansible_search_path' from source: unknown 46400 1727204529.06551: calling self._execute() 46400 1727204529.06659: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.06677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.06710: variable 'omit' from source: magic vars 46400 1727204529.07120: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.07142: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.07159: variable 'omit' from source: magic vars 46400 1727204529.07227: variable 'omit' from source: magic vars 46400 1727204529.07337: variable 'interface' from source: play vars 46400 1727204529.07364: variable 'omit' from source: magic vars 46400 1727204529.07413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204529.07453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204529.07487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204529.07508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204529.07523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204529.07556: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204529.07570: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.07581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.07686: Set connection var ansible_shell_type to sh 46400 1727204529.07708: Set connection var ansible_shell_executable to /bin/sh 46400 1727204529.07727: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204529.07738: Set connection var ansible_connection to ssh 46400 1727204529.07751: Set connection var ansible_pipelining to False 46400 1727204529.07768: Set connection var ansible_timeout to 10 46400 1727204529.07803: variable 'ansible_shell_executable' from source: unknown 46400 1727204529.07816: variable 'ansible_connection' from source: unknown 46400 1727204529.07824: variable 'ansible_module_compression' from source: unknown 46400 1727204529.07830: variable 'ansible_shell_type' from source: unknown 46400 1727204529.07836: variable 'ansible_shell_executable' from source: unknown 46400 1727204529.07842: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.07848: variable 'ansible_pipelining' from source: unknown 46400 1727204529.07854: variable 'ansible_timeout' from source: unknown 46400 1727204529.07863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.08010: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204529.08030: variable 'omit' from source: magic vars 46400 1727204529.08040: starting attempt loop 46400 1727204529.08045: running the handler 46400 1727204529.08197: variable 'interface_stat' from source: set_fact 46400 1727204529.08215: Evaluated conditional (not interface_stat.stat.exists): True 46400 1727204529.08226: handler run complete 46400 1727204529.08249: attempt loop complete, returning result 46400 1727204529.08256: _execute() done 46400 1727204529.08266: dumping result to json 46400 1727204529.08273: done dumping result, returning 46400 1727204529.08283: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [0affcd87-79f5-1303-fda8-000000000643] 46400 1727204529.08292: sending task result for task 0affcd87-79f5-1303-fda8-000000000643 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204529.08434: no more pending results, returning what we have 46400 1727204529.08438: results queue empty 46400 1727204529.08440: checking for any_errors_fatal 46400 1727204529.08449: done checking for any_errors_fatal 46400 1727204529.08449: checking for max_fail_percentage 46400 1727204529.08451: done checking for max_fail_percentage 46400 1727204529.08452: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.08453: done checking to see if all hosts have failed 46400 1727204529.08453: getting the remaining hosts for this loop 46400 1727204529.08455: done getting the remaining hosts for this loop 46400 1727204529.08459: getting the next task for host managed-node2 46400 1727204529.08471: done getting next task for host managed-node2 46400 1727204529.08474: ^ task is: TASK: Test 46400 1727204529.08476: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204529.08480: getting variables 46400 1727204529.08482: in VariableManager get_vars() 46400 1727204529.08515: Calling all_inventory to load vars for managed-node2 46400 1727204529.08517: Calling groups_inventory to load vars for managed-node2 46400 1727204529.08521: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.08534: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.08536: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.08539: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.09278: done sending task result for task 0affcd87-79f5-1303-fda8-000000000643 46400 1727204529.09281: WORKER PROCESS EXITING 46400 1727204529.10062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.11010: done with get_vars() 46400 1727204529.11045: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.054) 0:00:19.396 ***** 46400 1727204529.11162: entering _queue_task() for managed-node2/include_tasks 46400 1727204529.11511: worker is 1 (out of 1 available) 46400 1727204529.11529: exiting _queue_task() for managed-node2/include_tasks 46400 1727204529.11541: done queuing things up, now waiting for results queue to drain 46400 1727204529.11543: waiting for pending results... 46400 1727204529.11980: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204529.11989: in run() - task 0affcd87-79f5-1303-fda8-0000000005b8 46400 1727204529.11993: variable 'ansible_search_path' from source: unknown 46400 1727204529.12002: variable 'ansible_search_path' from source: unknown 46400 1727204529.12049: variable 'lsr_test' from source: include params 46400 1727204529.12253: variable 'lsr_test' from source: include params 46400 1727204529.12330: variable 'omit' from source: magic vars 46400 1727204529.12486: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.12502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.12518: variable 'omit' from source: magic vars 46400 1727204529.12786: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.12801: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.12812: variable 'item' from source: unknown 46400 1727204529.12891: variable 'item' from source: unknown 46400 1727204529.12925: variable 'item' from source: unknown 46400 1727204529.12999: variable 'item' from source: unknown 46400 1727204529.13147: dumping result to json 46400 1727204529.13156: done dumping result, returning 46400 1727204529.13169: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-0000000005b8] 46400 1727204529.13180: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b8 46400 1727204529.13282: no more pending results, returning what we have 46400 1727204529.13288: in VariableManager get_vars() 46400 1727204529.13329: Calling all_inventory to load vars for managed-node2 46400 1727204529.13332: Calling groups_inventory to load vars for managed-node2 46400 1727204529.13336: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.13351: Calling all_plugins_play to load vars for managed-node2 
46400 1727204529.13354: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.13357: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.14506: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b8 46400 1727204529.14511: WORKER PROCESS EXITING 46400 1727204529.14523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.15542: done with get_vars() 46400 1727204529.15562: variable 'ansible_search_path' from source: unknown 46400 1727204529.15566: variable 'ansible_search_path' from source: unknown 46400 1727204529.15607: we have included files to process 46400 1727204529.15608: generating all_blocks data 46400 1727204529.15610: done generating all_blocks data 46400 1727204529.15614: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 46400 1727204529.15615: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 46400 1727204529.15617: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 46400 1727204529.15914: done processing included file 46400 1727204529.15916: iterating over new_blocks loaded from include file 46400 1727204529.15917: in VariableManager get_vars() 46400 1727204529.15932: done with get_vars() 46400 1727204529.15933: filtering new block on tags 46400 1727204529.15963: done filtering new block on tags 46400 1727204529.15966: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed-node2 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 46400 1727204529.15971: extending task lists for all hosts with included blocks 46400 1727204529.16722: done extending task lists 46400 1727204529.16723: done processing included files 46400 1727204529.16724: results queue empty 46400 1727204529.16725: checking for any_errors_fatal 46400 1727204529.16728: done checking for any_errors_fatal 46400 1727204529.16728: checking for max_fail_percentage 46400 1727204529.16730: done checking for max_fail_percentage 46400 1727204529.16730: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.16731: done checking to see if all hosts have failed 46400 1727204529.16732: getting the remaining hosts for this loop 46400 1727204529.16733: done getting the remaining hosts for this loop 46400 1727204529.16735: getting the next task for host managed-node2 46400 1727204529.16739: done getting next task for host managed-node2 46400 1727204529.16741: ^ task is: TASK: Include network role 46400 1727204529.16744: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.16746: getting variables 46400 1727204529.16747: in VariableManager get_vars() 46400 1727204529.16756: Calling all_inventory to load vars for managed-node2 46400 1727204529.16758: Calling groups_inventory to load vars for managed-node2 46400 1727204529.16761: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.16768: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.16770: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.16773: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.17938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.19309: done with get_vars() 46400 1727204529.19325: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.082) 0:00:19.478 ***** 46400 1727204529.19388: entering _queue_task() for managed-node2/include_role 46400 1727204529.19626: worker is 1 (out of 1 available) 46400 1727204529.19640: exiting _queue_task() for managed-node2/include_role 46400 1727204529.19655: done queuing things up, now waiting for results queue to drain 46400 1727204529.19656: waiting for pending results... 
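
The task queued at this point lives at create_bridge_profile_no_autoconnect.yml:3 and, as the subsequent records confirm, pulls in fedora.linux_system_roles.network. A minimal sketch of such an include; any role variables passed along with it are omitted here because they are not visible in this part of the log.

    # Sketch only: the task name and the role name are confirmed by the log,
    # everything else (vars, public/apply options) is left out.
    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
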
46400 1727204529.19841: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204529.19925: in run() - task 0affcd87-79f5-1303-fda8-0000000006b1 46400 1727204529.19934: variable 'ansible_search_path' from source: unknown 46400 1727204529.19938: variable 'ansible_search_path' from source: unknown 46400 1727204529.19971: calling self._execute() 46400 1727204529.20044: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.20048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.20057: variable 'omit' from source: magic vars 46400 1727204529.20330: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.20339: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.20345: _execute() done 46400 1727204529.20349: dumping result to json 46400 1727204529.20352: done dumping result, returning 46400 1727204529.20355: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-0000000006b1] 46400 1727204529.20367: sending task result for task 0affcd87-79f5-1303-fda8-0000000006b1 46400 1727204529.20478: done sending task result for task 0affcd87-79f5-1303-fda8-0000000006b1 46400 1727204529.20480: WORKER PROCESS EXITING 46400 1727204529.20504: no more pending results, returning what we have 46400 1727204529.20509: in VariableManager get_vars() 46400 1727204529.20545: Calling all_inventory to load vars for managed-node2 46400 1727204529.20548: Calling groups_inventory to load vars for managed-node2 46400 1727204529.20552: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.20567: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.20570: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.20573: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.21368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.22304: done with get_vars() 46400 1727204529.22326: variable 'ansible_search_path' from source: unknown 46400 1727204529.22327: variable 'ansible_search_path' from source: unknown 46400 1727204529.22452: variable 'omit' from source: magic vars 46400 1727204529.22483: variable 'omit' from source: magic vars 46400 1727204529.22492: variable 'omit' from source: magic vars 46400 1727204529.22495: we have included files to process 46400 1727204529.22496: generating all_blocks data 46400 1727204529.22497: done generating all_blocks data 46400 1727204529.22498: processing included file: fedora.linux_system_roles.network 46400 1727204529.22513: in VariableManager get_vars() 46400 1727204529.22525: done with get_vars() 46400 1727204529.22547: in VariableManager get_vars() 46400 1727204529.22558: done with get_vars() 46400 1727204529.22589: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204529.22663: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204529.22713: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204529.22992: in VariableManager get_vars() 46400 1727204529.23005: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204529.24297: iterating over new_blocks loaded from 
include file 46400 1727204529.24299: in VariableManager get_vars() 46400 1727204529.24311: done with get_vars() 46400 1727204529.24312: filtering new block on tags 46400 1727204529.24483: done filtering new block on tags 46400 1727204529.24487: in VariableManager get_vars() 46400 1727204529.24497: done with get_vars() 46400 1727204529.24499: filtering new block on tags 46400 1727204529.24509: done filtering new block on tags 46400 1727204529.24511: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204529.24515: extending task lists for all hosts with included blocks 46400 1727204529.24616: done extending task lists 46400 1727204529.24617: done processing included files 46400 1727204529.24617: results queue empty 46400 1727204529.24618: checking for any_errors_fatal 46400 1727204529.24620: done checking for any_errors_fatal 46400 1727204529.24621: checking for max_fail_percentage 46400 1727204529.24621: done checking for max_fail_percentage 46400 1727204529.24622: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.24622: done checking to see if all hosts have failed 46400 1727204529.24623: getting the remaining hosts for this loop 46400 1727204529.24624: done getting the remaining hosts for this loop 46400 1727204529.24625: getting the next task for host managed-node2 46400 1727204529.24628: done getting next task for host managed-node2 46400 1727204529.24630: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204529.24633: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204529.24639: getting variables 46400 1727204529.24640: in VariableManager get_vars() 46400 1727204529.24648: Calling all_inventory to load vars for managed-node2 46400 1727204529.24649: Calling groups_inventory to load vars for managed-node2 46400 1727204529.24650: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.24654: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.24656: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.24657: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.25382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.26289: done with get_vars() 46400 1727204529.26304: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.069) 0:00:19.548 ***** 46400 1727204529.26357: entering _queue_task() for managed-node2/include_tasks 46400 1727204529.26616: worker is 1 (out of 1 available) 46400 1727204529.26631: exiting _queue_task() for managed-node2/include_tasks 46400 1727204529.26643: done queuing things up, now waiting for results queue to drain 46400 1727204529.26645: waiting for pending results... 46400 1727204529.26829: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204529.26917: in run() - task 0affcd87-79f5-1303-fda8-00000000072f 46400 1727204529.26928: variable 'ansible_search_path' from source: unknown 46400 1727204529.26932: variable 'ansible_search_path' from source: unknown 46400 1727204529.26960: calling self._execute() 46400 1727204529.27034: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.27038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.27047: variable 'omit' from source: magic vars 46400 1727204529.27318: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.27327: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.27333: _execute() done 46400 1727204529.27336: dumping result to json 46400 1727204529.27339: done dumping result, returning 46400 1727204529.27346: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-00000000072f] 46400 1727204529.27352: sending task result for task 0affcd87-79f5-1303-fda8-00000000072f 46400 1727204529.27437: done sending task result for task 0affcd87-79f5-1303-fda8-00000000072f 46400 1727204529.27440: WORKER PROCESS EXITING 46400 1727204529.27485: no more pending results, returning what we have 46400 1727204529.27490: in VariableManager get_vars() 46400 1727204529.27529: Calling all_inventory to load vars for managed-node2 46400 1727204529.27531: Calling groups_inventory to load vars for managed-node2 46400 1727204529.27533: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.27545: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.27548: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.27558: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.28367: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.29400: done with get_vars() 46400 1727204529.29414: variable 'ansible_search_path' from source: unknown 46400 1727204529.29415: variable 'ansible_search_path' from source: unknown 46400 1727204529.29441: we have included files to process 46400 1727204529.29442: generating all_blocks data 46400 1727204529.29443: done generating all_blocks data 46400 1727204529.29445: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204529.29446: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204529.29447: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204529.29833: done processing included file 46400 1727204529.29834: iterating over new_blocks loaded from include file 46400 1727204529.29835: in VariableManager get_vars() 46400 1727204529.29851: done with get_vars() 46400 1727204529.29852: filtering new block on tags 46400 1727204529.29875: done filtering new block on tags 46400 1727204529.29876: in VariableManager get_vars() 46400 1727204529.29890: done with get_vars() 46400 1727204529.29891: filtering new block on tags 46400 1727204529.29917: done filtering new block on tags 46400 1727204529.29918: in VariableManager get_vars() 46400 1727204529.29931: done with get_vars() 46400 1727204529.29932: filtering new block on tags 46400 1727204529.29959: done filtering new block on tags 46400 1727204529.29961: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204529.29967: extending task lists for all hosts with included blocks 46400 1727204529.30996: done extending task lists 46400 1727204529.30997: done processing included files 46400 1727204529.30997: results queue empty 46400 1727204529.30998: checking for any_errors_fatal 46400 1727204529.31000: done checking for any_errors_fatal 46400 1727204529.31001: checking for max_fail_percentage 46400 1727204529.31002: done checking for max_fail_percentage 46400 1727204529.31002: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.31003: done checking to see if all hosts have failed 46400 1727204529.31003: getting the remaining hosts for this loop 46400 1727204529.31004: done getting the remaining hosts for this loop 46400 1727204529.31006: getting the next task for host managed-node2 46400 1727204529.31009: done getting next task for host managed-node2 46400 1727204529.31011: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204529.31014: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.31020: getting variables 46400 1727204529.31021: in VariableManager get_vars() 46400 1727204529.31031: Calling all_inventory to load vars for managed-node2 46400 1727204529.31033: Calling groups_inventory to load vars for managed-node2 46400 1727204529.31034: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.31038: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.31040: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.31041: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.31682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.33175: done with get_vars() 46400 1727204529.33193: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.069) 0:00:19.617 ***** 46400 1727204529.33282: entering _queue_task() for managed-node2/setup 46400 1727204529.33534: worker is 1 (out of 1 available) 46400 1727204529.33548: exiting _queue_task() for managed-node2/setup 46400 1727204529.33566: done queuing things up, now waiting for results queue to drain 46400 1727204529.33568: waiting for pending results... 
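
The task queued here is the role's fact-gathering guard at set_facts.yml:3. The records that follow show its when expression, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluating to False, so the setup call is skipped. A hedged sketch of what such a guarded task can look like; only the task name and the conditional are taken from the log, and the gather_subset value is an assumed placeholder.

    # The when: expression is copied from the "Evaluated conditional" record below;
    # gather_subset: min is an assumption, not read from the log.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
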
46400 1727204529.33755: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204529.33859: in run() - task 0affcd87-79f5-1303-fda8-00000000078c 46400 1727204529.33873: variable 'ansible_search_path' from source: unknown 46400 1727204529.33877: variable 'ansible_search_path' from source: unknown 46400 1727204529.33906: calling self._execute() 46400 1727204529.33976: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.33980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.33990: variable 'omit' from source: magic vars 46400 1727204529.34271: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.34280: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.34463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204529.37746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204529.37819: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204529.37854: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204529.37898: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204529.37924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204529.38022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204529.38051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204529.38088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204529.38128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204529.38153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204529.38218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204529.38240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204529.38267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204529.38306: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204529.38321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204529.38506: variable '__network_required_facts' from source: role '' defaults 46400 1727204529.38522: variable 'ansible_facts' from source: unknown 46400 1727204529.39337: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204529.39343: when evaluation is False, skipping this task 46400 1727204529.39346: _execute() done 46400 1727204529.39348: dumping result to json 46400 1727204529.39350: done dumping result, returning 46400 1727204529.39358: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-00000000078c] 46400 1727204529.39367: sending task result for task 0affcd87-79f5-1303-fda8-00000000078c 46400 1727204529.39476: done sending task result for task 0affcd87-79f5-1303-fda8-00000000078c 46400 1727204529.39478: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204529.39526: no more pending results, returning what we have 46400 1727204529.39531: results queue empty 46400 1727204529.39532: checking for any_errors_fatal 46400 1727204529.39533: done checking for any_errors_fatal 46400 1727204529.39537: checking for max_fail_percentage 46400 1727204529.39539: done checking for max_fail_percentage 46400 1727204529.39540: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.39540: done checking to see if all hosts have failed 46400 1727204529.39541: getting the remaining hosts for this loop 46400 1727204529.39543: done getting the remaining hosts for this loop 46400 1727204529.39547: getting the next task for host managed-node2 46400 1727204529.39557: done getting next task for host managed-node2 46400 1727204529.39561: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204529.39568: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.39587: getting variables 46400 1727204529.39590: in VariableManager get_vars() 46400 1727204529.39630: Calling all_inventory to load vars for managed-node2 46400 1727204529.39633: Calling groups_inventory to load vars for managed-node2 46400 1727204529.39635: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.39646: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.39649: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.39658: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.41205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.43322: done with get_vars() 46400 1727204529.43349: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.101) 0:00:19.719 ***** 46400 1727204529.43454: entering _queue_task() for managed-node2/stat 46400 1727204529.43924: worker is 1 (out of 1 available) 46400 1727204529.43938: exiting _queue_task() for managed-node2/stat 46400 1727204529.43997: done queuing things up, now waiting for results queue to drain 46400 1727204529.43999: waiting for pending results... 46400 1727204529.44425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204529.44623: in run() - task 0affcd87-79f5-1303-fda8-00000000078e 46400 1727204529.44635: variable 'ansible_search_path' from source: unknown 46400 1727204529.44639: variable 'ansible_search_path' from source: unknown 46400 1727204529.44726: calling self._execute() 46400 1727204529.44810: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.44816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.44825: variable 'omit' from source: magic vars 46400 1727204529.45184: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.45198: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.45669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204529.45935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204529.45981: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204529.46013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204529.46044: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204529.46196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204529.46220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204529.46245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204529.46275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204529.46362: variable '__network_is_ostree' from source: set_fact 46400 1727204529.46374: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204529.46377: when evaluation is False, skipping this task 46400 1727204529.46380: _execute() done 46400 1727204529.46384: dumping result to json 46400 1727204529.46386: done dumping result, returning 46400 1727204529.46394: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-00000000078e] 46400 1727204529.46399: sending task result for task 0affcd87-79f5-1303-fda8-00000000078e 46400 1727204529.46499: done sending task result for task 0affcd87-79f5-1303-fda8-00000000078e 46400 1727204529.46504: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204529.46562: no more pending results, returning what we have 46400 1727204529.46569: results queue empty 46400 1727204529.46570: checking for any_errors_fatal 46400 1727204529.46581: done checking for any_errors_fatal 46400 1727204529.46582: checking for max_fail_percentage 46400 1727204529.46584: done checking for max_fail_percentage 46400 1727204529.46585: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.46585: done checking to see if all hosts have failed 46400 1727204529.46586: getting the remaining hosts for this loop 46400 1727204529.46588: done getting the remaining hosts for this loop 46400 1727204529.46592: getting the next task for host managed-node2 46400 1727204529.46601: done getting next task for host managed-node2 46400 1727204529.46605: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204529.46610: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.46626: getting variables 46400 1727204529.46628: in VariableManager get_vars() 46400 1727204529.46663: Calling all_inventory to load vars for managed-node2 46400 1727204529.46667: Calling groups_inventory to load vars for managed-node2 46400 1727204529.46669: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.46680: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.46682: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.46685: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.49123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.52983: done with get_vars() 46400 1727204529.53017: done getting variables 46400 1727204529.53201: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.097) 0:00:19.816 ***** 46400 1727204529.53243: entering _queue_task() for managed-node2/set_fact 46400 1727204529.53922: worker is 1 (out of 1 available) 46400 1727204529.53936: exiting _queue_task() for managed-node2/set_fact 46400 1727204529.53950: done queuing things up, now waiting for results queue to drain 46400 1727204529.53951: waiting for pending results... 
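[editor's note] The three skips logged above all come from `when:` conditionals evaluating to False. The first task checked whether any fact named in `__network_required_facts` is missing from `ansible_facts` (a set difference), and the two ostree tasks only run while `__network_is_ostree` is still undefined; here it was already set by an earlier invocation. A minimal Python sketch of that logic follows -- the fact names and values are invented for illustration only, since Ansible evaluates the real expressions as Jinja2, not with this code:

    # Sketch of the two skip conditions seen in the log above (illustrative values).
    ansible_facts = {
        "distribution": "CentOS",             # hypothetical fact values
        "distribution_major_version": "9",
        "os_family": "RedHat",
    }

    # Condition 1: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    required_facts = ["distribution", "distribution_major_version", "os_family"]
    missing = set(required_facts) - set(ansible_facts)   # the Jinja2 'difference' filter as a set op
    run_fact_gathering = len(missing) > 0                 # False here, so the task is skipped

    # Condition 2: not __network_is_ostree is defined
    facts_set_earlier = {"__network_is_ostree": False}    # already set on a previous role run
    run_ostree_check = "__network_is_ostree" not in facts_set_earlier   # False -> both ostree tasks skip

    print(f"gather extra facts? {run_fact_gathering}")
    print(f"re-check ostree?    {run_ostree_check}")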
46400 1727204529.54846: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204529.55094: in run() - task 0affcd87-79f5-1303-fda8-00000000078f 46400 1727204529.55107: variable 'ansible_search_path' from source: unknown 46400 1727204529.55111: variable 'ansible_search_path' from source: unknown 46400 1727204529.55146: calling self._execute() 46400 1727204529.55233: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.55240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.55250: variable 'omit' from source: magic vars 46400 1727204529.55604: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.55617: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.56491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204529.56958: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204529.57210: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204529.57243: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204529.57277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204529.57358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204529.57689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204529.57715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204529.57740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204529.57832: variable '__network_is_ostree' from source: set_fact 46400 1727204529.57840: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204529.57843: when evaluation is False, skipping this task 46400 1727204529.57845: _execute() done 46400 1727204529.57848: dumping result to json 46400 1727204529.57851: done dumping result, returning 46400 1727204529.57860: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-00000000078f] 46400 1727204529.58072: sending task result for task 0affcd87-79f5-1303-fda8-00000000078f 46400 1727204529.58168: done sending task result for task 0affcd87-79f5-1303-fda8-00000000078f 46400 1727204529.58170: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204529.58217: no more pending results, returning what we have 46400 1727204529.58221: results queue empty 46400 1727204529.58222: checking for any_errors_fatal 46400 1727204529.58230: done checking for any_errors_fatal 46400 
1727204529.58230: checking for max_fail_percentage 46400 1727204529.58232: done checking for max_fail_percentage 46400 1727204529.58233: checking to see if all hosts have failed and the running result is not ok 46400 1727204529.58234: done checking to see if all hosts have failed 46400 1727204529.58235: getting the remaining hosts for this loop 46400 1727204529.58236: done getting the remaining hosts for this loop 46400 1727204529.58240: getting the next task for host managed-node2 46400 1727204529.58252: done getting next task for host managed-node2 46400 1727204529.58255: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204529.58261: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204529.58282: getting variables 46400 1727204529.58285: in VariableManager get_vars() 46400 1727204529.58323: Calling all_inventory to load vars for managed-node2 46400 1727204529.58326: Calling groups_inventory to load vars for managed-node2 46400 1727204529.58329: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204529.58340: Calling all_plugins_play to load vars for managed-node2 46400 1727204529.58342: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204529.58345: Calling groups_plugins_play to load vars for managed-node2 46400 1727204529.61006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204529.64592: done with get_vars() 46400 1727204529.64737: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.117) 0:00:19.934 ***** 46400 1727204529.64961: entering _queue_task() for managed-node2/service_facts 46400 1727204529.65637: worker is 1 (out of 1 available) 46400 1727204529.65652: exiting _queue_task() for managed-node2/service_facts 46400 1727204529.65667: done queuing things up, now waiting for results queue to drain 46400 1727204529.65669: waiting for pending results... 
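[editor's note] The output that follows shows how the service_facts module actually reaches the host: the connection plugin issues low-level shell commands over the multiplexed SSH session, first `echo ~` to resolve the remote home directory, then the `( umask 77 && mkdir -p ... && mkdir ... && echo name=path )` idiom to create a private per-task temp directory whose path is echoed back to the controller. A rough local illustration of that shell idiom is sketched below; it runs via subprocess instead of SSH, the base path and naming are approximations of what the log shows, and none of this is Ansible's own code:

    # Local illustration of the temp-directory idiom visible in the
    # _low_level_execute_command() calls below; not Ansible's connection plugin.
    import subprocess
    import time
    import random

    base = "/tmp/.ansible-demo/tmp"   # hypothetical; the real run uses ~/.ansible/tmp on the target
    stamp = f"ansible-tmp-{time.time():.5f}-{random.randrange(10**6)}"

    cmd = (
        "( umask 77 && "
        f'mkdir -p "` echo {base} `" && '
        f'mkdir "` echo {base}/{stamp} `" && '
        f'echo {stamp}="` echo {base}/{stamp} `" ) && sleep 0'
    )

    result = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True, check=True)
    # stdout is "name=path"; the controller parses it to learn the remote tmp directory
    print(result.stdout.strip())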
46400 1727204529.66878: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204529.66993: in run() - task 0affcd87-79f5-1303-fda8-000000000791 46400 1727204529.67005: variable 'ansible_search_path' from source: unknown 46400 1727204529.67009: variable 'ansible_search_path' from source: unknown 46400 1727204529.67043: calling self._execute() 46400 1727204529.67133: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.67139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.67150: variable 'omit' from source: magic vars 46400 1727204529.67905: variable 'ansible_distribution_major_version' from source: facts 46400 1727204529.67917: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204529.67924: variable 'omit' from source: magic vars 46400 1727204529.68001: variable 'omit' from source: magic vars 46400 1727204529.68035: variable 'omit' from source: magic vars 46400 1727204529.68083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204529.68120: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204529.68143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204529.68161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204529.68178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204529.68209: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204529.68212: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.68214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.68316: Set connection var ansible_shell_type to sh 46400 1727204529.68327: Set connection var ansible_shell_executable to /bin/sh 46400 1727204529.68333: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204529.68339: Set connection var ansible_connection to ssh 46400 1727204529.68345: Set connection var ansible_pipelining to False 46400 1727204529.68351: Set connection var ansible_timeout to 10 46400 1727204529.68382: variable 'ansible_shell_executable' from source: unknown 46400 1727204529.68388: variable 'ansible_connection' from source: unknown 46400 1727204529.68390: variable 'ansible_module_compression' from source: unknown 46400 1727204529.68393: variable 'ansible_shell_type' from source: unknown 46400 1727204529.68396: variable 'ansible_shell_executable' from source: unknown 46400 1727204529.68399: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204529.68401: variable 'ansible_pipelining' from source: unknown 46400 1727204529.68403: variable 'ansible_timeout' from source: unknown 46400 1727204529.68405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204529.69488: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204529.69498: variable 'omit' from source: magic vars 46400 
1727204529.69501: starting attempt loop 46400 1727204529.69505: running the handler 46400 1727204529.69520: _low_level_execute_command(): starting 46400 1727204529.69529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204529.71508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204529.71520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.71531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.71546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.71591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.71599: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204529.71610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.71623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204529.71631: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204529.71639: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204529.71646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.72091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.72095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.72103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.72110: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204529.72143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.72196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204529.72216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204529.72229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204529.72307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204529.73971: stdout chunk (state=3): >>>/root <<< 46400 1727204529.74081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204529.74151: stderr chunk (state=3): >>><<< 46400 1727204529.74155: stdout chunk (state=3): >>><<< 46400 1727204529.74182: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204529.74196: _low_level_execute_command(): starting 46400 1727204529.74202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641 `" && echo ansible-tmp-1727204529.74182-47785-140750994921641="` echo /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641 `" ) && sleep 0' 46400 1727204529.74822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204529.74831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.74843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.74855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.74897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.74904: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204529.74914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.74927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204529.74934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204529.74972: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204529.74975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.74977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.74982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.74990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.74997: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204529.75006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.75126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204529.75130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204529.75132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204529.75481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204529.77192: stdout chunk (state=3): >>>ansible-tmp-1727204529.74182-47785-140750994921641=/root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641 <<< 46400 1727204529.77410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204529.77414: stdout chunk (state=3): >>><<< 46400 1727204529.77421: stderr chunk (state=3): >>><<< 46400 1727204529.77438: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204529.74182-47785-140750994921641=/root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204529.77493: variable 'ansible_module_compression' from source: unknown 46400 1727204529.77537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204529.77579: variable 'ansible_facts' from source: unknown 46400 1727204529.77670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/AnsiballZ_service_facts.py 46400 1727204529.77813: Sending initial data 46400 1727204529.77817: Sent initial data (160 bytes) 46400 1727204529.78791: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204529.78799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.78809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.78822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.78863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.78876: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204529.78887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.78900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204529.78907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204529.78914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204529.78922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.78932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.78944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.78956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.78959: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204529.78972: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.79040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204529.79055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204529.79060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204529.79251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204529.80858: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204529.80908: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204529.80947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpr4n1vn2_ /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/AnsiballZ_service_facts.py <<< 46400 1727204529.80985: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204529.82151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204529.82242: stderr chunk (state=3): >>><<< 46400 1727204529.82245: stdout chunk (state=3): >>><<< 46400 1727204529.82277: done transferring module to remote 46400 1727204529.82290: _low_level_execute_command(): starting 46400 1727204529.82295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/ /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/AnsiballZ_service_facts.py && sleep 0' 46400 1727204529.83222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204529.83226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.83228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.83231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.83233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.83235: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204529.83237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.83240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204529.83243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204529.83245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204529.83247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.83250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.83252: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.83254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.83256: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204529.83258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.84270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204529.84273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204529.84276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204529.84278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204529.85250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204529.85253: stdout chunk (state=3): >>><<< 46400 1727204529.85255: stderr chunk (state=3): >>><<< 46400 1727204529.85258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204529.85260: _low_level_execute_command(): starting 46400 1727204529.85262: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/AnsiballZ_service_facts.py && sleep 0' 46400 1727204529.86507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204529.86521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.86536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.86553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.86597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.86686: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204529.86734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.86753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204529.86768: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204529.86780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204529.86792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204529.86806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204529.86821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204529.86869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204529.86882: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204529.86917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204529.86993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204529.87199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204529.87214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204529.87340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.17046: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204531.17082: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": 
"systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 46400 1727204531.17089: stdout chunk (state=3): >>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204531.18381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204531.18450: stderr chunk (state=3): >>><<< 46400 1727204531.18454: stdout chunk (state=3): >>><<< 46400 1727204531.18486: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204531.19168: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204531.19178: _low_level_execute_command(): starting 46400 1727204531.19183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204529.74182-47785-140750994921641/ > /dev/null 2>&1 && sleep 0' 46400 1727204531.19845: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204531.19850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.19871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.19901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.19904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.19923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.19927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.19983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.19988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.20000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.20051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.21933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204531.21954: stderr chunk (state=3): >>><<< 46400 1727204531.21957: stdout chunk (state=3): >>><<< 46400 1727204531.22070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204531.22074: handler run complete 46400 1727204531.22158: variable 'ansible_facts' from source: unknown 46400 1727204531.22412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204531.22737: variable 'ansible_facts' from source: unknown 46400 1727204531.22813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204531.22920: attempt loop complete, returning result 46400 1727204531.22924: _execute() done 46400 1727204531.22927: dumping result to json 46400 1727204531.22959: done dumping result, returning 46400 1727204531.22970: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000000791] 46400 1727204531.22975: sending task result for task 0affcd87-79f5-1303-fda8-000000000791 46400 1727204531.23582: done sending task result for task 0affcd87-79f5-1303-fda8-000000000791 46400 1727204531.23585: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204531.23654: no more pending results, returning what we have 46400 1727204531.23657: results queue empty 46400 1727204531.23658: checking for any_errors_fatal 46400 1727204531.23663: done checking for any_errors_fatal 46400 1727204531.23666: checking for max_fail_percentage 46400 1727204531.23668: done checking for max_fail_percentage 46400 1727204531.23669: checking to see if all hosts have failed and the running result is not ok 46400 1727204531.23669: done checking to see if all hosts have failed 46400 1727204531.23670: getting the remaining hosts for this loop 46400 1727204531.23671: done getting the remaining hosts for this loop 46400 1727204531.23674: getting the next task for host managed-node2 46400 1727204531.23679: done getting next task for host managed-node2 46400 1727204531.23682: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204531.23686: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204531.23693: getting variables 46400 1727204531.23694: in VariableManager get_vars() 46400 1727204531.23717: Calling all_inventory to load vars for managed-node2 46400 1727204531.23719: Calling groups_inventory to load vars for managed-node2 46400 1727204531.23721: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204531.23728: Calling all_plugins_play to load vars for managed-node2 46400 1727204531.23733: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204531.23735: Calling groups_plugins_play to load vars for managed-node2 46400 1727204531.24678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204531.26303: done with get_vars() 46400 1727204531.26328: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:11 -0400 (0:00:01.614) 0:00:21.548 ***** 46400 1727204531.26430: entering _queue_task() for managed-node2/package_facts 46400 1727204531.26954: worker is 1 (out of 1 available) 46400 1727204531.26972: exiting _queue_task() for managed-node2/package_facts 46400 1727204531.26986: done queuing things up, now waiting for results queue to drain 46400 1727204531.26988: waiting for pending results... 
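The log only names the tasks in set_facts.yml, it does not show the file itself, so the following is a hypothetical reconstruction based solely on what the run reveals: both modules are invoked with empty module_args, and the service scan result was censored because no_log was in effect for that task (whether set on the task or inherited is an assumption).

# Hypothetical sketch, NOT the actual roles/network/tasks/set_facts.yml.
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true  # assumption; matches the "censored" result logged above

- name: Check which packages are installed
  ansible.builtin.package_facts:

Running package_facts with no arguments lets the module auto-detect the package manager, which is consistent with the empty module_args and the rpm-sourced entries that appear in the output that follows.
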
46400 1727204531.27388: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204531.27488: in run() - task 0affcd87-79f5-1303-fda8-000000000792 46400 1727204531.27506: variable 'ansible_search_path' from source: unknown 46400 1727204531.27509: variable 'ansible_search_path' from source: unknown 46400 1727204531.27549: calling self._execute() 46400 1727204531.27647: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204531.27653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204531.27668: variable 'omit' from source: magic vars 46400 1727204531.28072: variable 'ansible_distribution_major_version' from source: facts 46400 1727204531.28190: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204531.28194: variable 'omit' from source: magic vars 46400 1727204531.28199: variable 'omit' from source: magic vars 46400 1727204531.28327: variable 'omit' from source: magic vars 46400 1727204531.28330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204531.28333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204531.28336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204531.28354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204531.28375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204531.28409: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204531.28412: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204531.28415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204531.28525: Set connection var ansible_shell_type to sh 46400 1727204531.28535: Set connection var ansible_shell_executable to /bin/sh 46400 1727204531.28541: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204531.28544: Set connection var ansible_connection to ssh 46400 1727204531.28551: Set connection var ansible_pipelining to False 46400 1727204531.28557: Set connection var ansible_timeout to 10 46400 1727204531.28595: variable 'ansible_shell_executable' from source: unknown 46400 1727204531.28602: variable 'ansible_connection' from source: unknown 46400 1727204531.28605: variable 'ansible_module_compression' from source: unknown 46400 1727204531.28608: variable 'ansible_shell_type' from source: unknown 46400 1727204531.28613: variable 'ansible_shell_executable' from source: unknown 46400 1727204531.28615: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204531.28662: variable 'ansible_pipelining' from source: unknown 46400 1727204531.28712: variable 'ansible_timeout' from source: unknown 46400 1727204531.28718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204531.29947: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204531.29957: variable 'omit' from source: magic vars 46400 
1727204531.29960: starting attempt loop 46400 1727204531.29968: running the handler 46400 1727204531.30002: _low_level_execute_command(): starting 46400 1727204531.30010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204531.31365: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204531.31672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.31678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.31682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.31685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.31687: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204531.31689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.31691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204531.31693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204531.31696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204531.31698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.31700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.31703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.31705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.31706: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204531.31708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.31712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.31714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.31716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.31781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.33317: stdout chunk (state=3): >>>/root <<< 46400 1727204531.33489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204531.33493: stdout chunk (state=3): >>><<< 46400 1727204531.33503: stderr chunk (state=3): >>><<< 46400 1727204531.33527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204531.33540: _low_level_execute_command(): starting 46400 1727204531.33546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778 `" && echo ansible-tmp-1727204531.3352525-47917-74613643576778="` echo /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778 `" ) && sleep 0' 46400 1727204531.34681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204531.34690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.34700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.34715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.34750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.34760: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204531.34777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.34790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204531.34797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204531.34804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204531.34812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.34820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.34832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.34839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.34846: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204531.34855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.34934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.34949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.34961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.35033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.36920: stdout chunk (state=3): >>>ansible-tmp-1727204531.3352525-47917-74613643576778=/root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778 <<< 46400 1727204531.37070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204531.37073: stdout chunk (state=3): >>><<< 46400 1727204531.37076: stderr chunk (state=3): >>><<< 46400 1727204531.37275: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204531.3352525-47917-74613643576778=/root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204531.37278: variable 'ansible_module_compression' from source: unknown 46400 1727204531.37280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204531.37282: variable 'ansible_facts' from source: unknown 46400 1727204531.37416: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/AnsiballZ_package_facts.py 46400 1727204531.37557: Sending initial data 46400 1727204531.37560: Sent initial data (161 bytes) 46400 1727204531.38318: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.38322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.38349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.38355: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204531.38371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.38382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204531.38388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.38397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.38404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.38452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.38482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.38487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.38527: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.40242: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204531.40274: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204531.40304: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpkm0fju24 /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/AnsiballZ_package_facts.py <<< 46400 1727204531.40339: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204531.42883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204531.42972: stderr chunk (state=3): >>><<< 46400 1727204531.42994: stdout chunk (state=3): >>><<< 46400 1727204531.43018: done transferring module to remote 46400 1727204531.43030: _low_level_execute_command(): starting 46400 1727204531.43036: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/ /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/AnsiballZ_package_facts.py && sleep 0' 46400 1727204531.43701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204531.43711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.43720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.43734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.43778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.43786: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204531.43796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.43810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204531.43817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204531.43826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204531.43831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.43840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.43852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.43859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.43874: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204531.43884: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.43955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.43978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.43990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.44055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.45830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204531.45834: stdout chunk (state=3): >>><<< 46400 1727204531.45840: stderr chunk (state=3): >>><<< 46400 1727204531.45854: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204531.45858: _low_level_execute_command(): starting 46400 1727204531.45867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/AnsiballZ_package_facts.py && sleep 0' 46400 1727204531.46982: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.46987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.47033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.47038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.47052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.47060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.47139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.47143: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 46400 1727204531.47161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.47235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204531.93779: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": 
"2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": 
[{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": 
"2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204531.93806: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": 
[{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": 
"0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": 
"python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": 
"0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204531.95354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204531.95385: stderr chunk (state=3): >>><<< 46400 1727204531.95388: stdout chunk (state=3): >>><<< 46400 1727204531.96072: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204531.97947: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204531.97980: _low_level_execute_command(): starting 46400 1727204531.97989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204531.3352525-47917-74613643576778/ > /dev/null 2>&1 && sleep 0' 46400 1727204531.98653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204531.98676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.98692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.98708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.98751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.98771: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204531.98787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.98804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204531.98815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204531.98826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204531.98839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204531.98851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204531.98872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204531.98887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204531.98892: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204531.98902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204531.98978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204531.98995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204531.99003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204531.99076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204532.00988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204532.00992: stdout chunk (state=3): >>><<< 46400 1727204532.00995: stderr chunk (state=3): >>><<< 46400 1727204532.01542: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204532.01546: handler run complete 46400 1727204532.01967: variable 'ansible_facts' from source: unknown 46400 1727204532.02446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.04133: variable 'ansible_facts' from source: unknown 46400 1727204532.04394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.04841: attempt loop complete, returning result 46400 1727204532.04850: _execute() done 46400 1727204532.04853: dumping result to json 46400 1727204532.04985: done dumping result, returning 46400 1727204532.04992: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000000792] 46400 1727204532.04998: sending task result for task 0affcd87-79f5-1303-fda8-000000000792 46400 1727204532.06943: done sending task result for task 0affcd87-79f5-1303-fda8-000000000792 46400 1727204532.06947: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204532.07037: no more pending results, returning what we have 46400 1727204532.07039: results queue empty 46400 1727204532.07040: checking for any_errors_fatal 46400 1727204532.07044: done checking for any_errors_fatal 46400 1727204532.07044: checking for max_fail_percentage 46400 1727204532.07045: done checking for max_fail_percentage 46400 1727204532.07046: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.07047: done checking to see if all hosts have failed 46400 1727204532.07047: getting the remaining hosts for this loop 46400 1727204532.07048: done getting the remaining hosts for this loop 46400 1727204532.07051: getting the next task for host managed-node2 46400 1727204532.07057: done getting next task for host managed-node2 46400 1727204532.07060: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204532.07066: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.07074: getting variables 46400 1727204532.07075: in VariableManager get_vars() 46400 1727204532.07098: Calling all_inventory to load vars for managed-node2 46400 1727204532.07100: Calling groups_inventory to load vars for managed-node2 46400 1727204532.07106: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.07112: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.07114: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.07116: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.07850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.09148: done with get_vars() 46400 1727204532.09180: done getting variables 46400 1727204532.09245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.828) 0:00:22.377 ***** 46400 1727204532.09289: entering _queue_task() for managed-node2/debug 46400 1727204532.09619: worker is 1 (out of 1 available) 46400 1727204532.09634: exiting _queue_task() for managed-node2/debug 46400 1727204532.09650: done queuing things up, now waiting for results queue to drain 46400 1727204532.09652: waiting for pending results... 
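[Editor's note] The records above load the debug action plugin and queue the "Print network provider" task from roles/network/tasks/main.yml:7; the task's own YAML is not included in this log. A plausible minimal form, consistent with the "Using network provider: nm" message printed a few records later, would be the following (the exact wording is an assumption):

# Sketch of the queued debug task; the message format is inferred from the task output below.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"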
46400 1727204532.09957: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204532.10113: in run() - task 0affcd87-79f5-1303-fda8-000000000730 46400 1727204532.10187: variable 'ansible_search_path' from source: unknown 46400 1727204532.10195: variable 'ansible_search_path' from source: unknown 46400 1727204532.10237: calling self._execute() 46400 1727204532.10419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.10434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.10450: variable 'omit' from source: magic vars 46400 1727204532.10953: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.10975: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.10988: variable 'omit' from source: magic vars 46400 1727204532.11072: variable 'omit' from source: magic vars 46400 1727204532.11190: variable 'network_provider' from source: set_fact 46400 1727204532.11212: variable 'omit' from source: magic vars 46400 1727204532.11267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204532.11310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204532.11342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204532.11372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204532.11389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204532.11432: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204532.11443: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.11451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.11567: Set connection var ansible_shell_type to sh 46400 1727204532.11584: Set connection var ansible_shell_executable to /bin/sh 46400 1727204532.11595: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204532.11605: Set connection var ansible_connection to ssh 46400 1727204532.11615: Set connection var ansible_pipelining to False 46400 1727204532.11625: Set connection var ansible_timeout to 10 46400 1727204532.11668: variable 'ansible_shell_executable' from source: unknown 46400 1727204532.11678: variable 'ansible_connection' from source: unknown 46400 1727204532.11686: variable 'ansible_module_compression' from source: unknown 46400 1727204532.11692: variable 'ansible_shell_type' from source: unknown 46400 1727204532.11698: variable 'ansible_shell_executable' from source: unknown 46400 1727204532.11705: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.11712: variable 'ansible_pipelining' from source: unknown 46400 1727204532.11719: variable 'ansible_timeout' from source: unknown 46400 1727204532.11726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.11893: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204532.11910: variable 'omit' from source: magic vars 46400 1727204532.11920: starting attempt loop 46400 1727204532.11927: running the handler 46400 1727204532.11986: handler run complete 46400 1727204532.12007: attempt loop complete, returning result 46400 1727204532.12014: _execute() done 46400 1727204532.12021: dumping result to json 46400 1727204532.12028: done dumping result, returning 46400 1727204532.12041: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000000730] 46400 1727204532.12051: sending task result for task 0affcd87-79f5-1303-fda8-000000000730 46400 1727204532.12171: done sending task result for task 0affcd87-79f5-1303-fda8-000000000730 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204532.12239: no more pending results, returning what we have 46400 1727204532.12243: results queue empty 46400 1727204532.12244: checking for any_errors_fatal 46400 1727204532.12256: done checking for any_errors_fatal 46400 1727204532.12257: checking for max_fail_percentage 46400 1727204532.12259: done checking for max_fail_percentage 46400 1727204532.12259: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.12260: done checking to see if all hosts have failed 46400 1727204532.12261: getting the remaining hosts for this loop 46400 1727204532.12263: done getting the remaining hosts for this loop 46400 1727204532.12268: getting the next task for host managed-node2 46400 1727204532.12277: done getting next task for host managed-node2 46400 1727204532.12281: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204532.12287: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204532.12299: getting variables 46400 1727204532.12300: in VariableManager get_vars() 46400 1727204532.12336: Calling all_inventory to load vars for managed-node2 46400 1727204532.12338: Calling groups_inventory to load vars for managed-node2 46400 1727204532.12340: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.12352: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.12354: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.12357: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.12882: WORKER PROCESS EXITING 46400 1727204532.14480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.16476: done with get_vars() 46400 1727204532.16505: done getting variables 46400 1727204532.16570: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.073) 0:00:22.450 ***** 46400 1727204532.16613: entering _queue_task() for managed-node2/fail 46400 1727204532.16941: worker is 1 (out of 1 available) 46400 1727204532.16956: exiting _queue_task() for managed-node2/fail 46400 1727204532.16973: done queuing things up, now waiting for results queue to drain 46400 1727204532.16975: waiting for pending results... 
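[Editor's note] This guard and the similar "below 8" guard queued a few records later both use the fail action behind a when list. The log only shows which condition evaluated False (network_state != {}), so the messages and the extra conditions in the sketch below are assumptions illustrating the pattern, not the role's actual source.

# Illustrative guard tasks; only the `network_state != {}` condition is taken from the log,
# the rest (messages, provider/version checks) are assumptions.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider
  when:
    - network_state != {}
    - network_provider == "initscripts"

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running EL 8 or later
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8

Because when lists short-circuit, the first condition to evaluate False (here network_state != {}) is the one reported as false_condition in the skip output.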
46400 1727204532.17233: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204532.17356: in run() - task 0affcd87-79f5-1303-fda8-000000000731 46400 1727204532.17379: variable 'ansible_search_path' from source: unknown 46400 1727204532.17387: variable 'ansible_search_path' from source: unknown 46400 1727204532.17430: calling self._execute() 46400 1727204532.17534: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.17545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.17560: variable 'omit' from source: magic vars 46400 1727204532.17951: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.17975: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.18106: variable 'network_state' from source: role '' defaults 46400 1727204532.18120: Evaluated conditional (network_state != {}): False 46400 1727204532.18128: when evaluation is False, skipping this task 46400 1727204532.18135: _execute() done 46400 1727204532.18142: dumping result to json 46400 1727204532.18149: done dumping result, returning 46400 1727204532.18159: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000000731] 46400 1727204532.18172: sending task result for task 0affcd87-79f5-1303-fda8-000000000731 46400 1727204532.18284: done sending task result for task 0affcd87-79f5-1303-fda8-000000000731 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204532.18336: no more pending results, returning what we have 46400 1727204532.18341: results queue empty 46400 1727204532.18342: checking for any_errors_fatal 46400 1727204532.18349: done checking for any_errors_fatal 46400 1727204532.18350: checking for max_fail_percentage 46400 1727204532.18353: done checking for max_fail_percentage 46400 1727204532.18354: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.18355: done checking to see if all hosts have failed 46400 1727204532.18355: getting the remaining hosts for this loop 46400 1727204532.18357: done getting the remaining hosts for this loop 46400 1727204532.18362: getting the next task for host managed-node2 46400 1727204532.18374: done getting next task for host managed-node2 46400 1727204532.18379: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204532.18385: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.18405: getting variables 46400 1727204532.18407: in VariableManager get_vars() 46400 1727204532.18447: Calling all_inventory to load vars for managed-node2 46400 1727204532.18450: Calling groups_inventory to load vars for managed-node2 46400 1727204532.18453: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.18468: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.18472: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.18475: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.23403: WORKER PROCESS EXITING 46400 1727204532.23421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.24323: done with get_vars() 46400 1727204532.24341: done getting variables 46400 1727204532.24380: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.077) 0:00:22.528 ***** 46400 1727204532.24402: entering _queue_task() for managed-node2/fail 46400 1727204532.24641: worker is 1 (out of 1 available) 46400 1727204532.24654: exiting _queue_task() for managed-node2/fail 46400 1727204532.24669: done queuing things up, now waiting for results queue to drain 46400 1727204532.24670: waiting for pending results... 
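[Editor's note] Both guards resolve network_state from "role '' defaults" and see an empty mapping, which is why network_state != {} evaluates to False. In a conventional role layout that default would live in defaults/main.yml; a one-line sketch (the actual defaults file is not part of this log):

# defaults/main.yml (sketch): an empty network_state makes the guard conditions above skip.
network_state: {}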
46400 1727204532.24866: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204532.24977: in run() - task 0affcd87-79f5-1303-fda8-000000000732 46400 1727204532.24985: variable 'ansible_search_path' from source: unknown 46400 1727204532.24989: variable 'ansible_search_path' from source: unknown 46400 1727204532.25017: calling self._execute() 46400 1727204532.25094: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.25099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.25108: variable 'omit' from source: magic vars 46400 1727204532.25412: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.25423: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.25511: variable 'network_state' from source: role '' defaults 46400 1727204532.25518: Evaluated conditional (network_state != {}): False 46400 1727204532.25521: when evaluation is False, skipping this task 46400 1727204532.25525: _execute() done 46400 1727204532.25528: dumping result to json 46400 1727204532.25530: done dumping result, returning 46400 1727204532.25558: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000000732] 46400 1727204532.25561: sending task result for task 0affcd87-79f5-1303-fda8-000000000732 46400 1727204532.25656: done sending task result for task 0affcd87-79f5-1303-fda8-000000000732 46400 1727204532.25658: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204532.25718: no more pending results, returning what we have 46400 1727204532.25722: results queue empty 46400 1727204532.25723: checking for any_errors_fatal 46400 1727204532.25733: done checking for any_errors_fatal 46400 1727204532.25734: checking for max_fail_percentage 46400 1727204532.25736: done checking for max_fail_percentage 46400 1727204532.25737: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.25738: done checking to see if all hosts have failed 46400 1727204532.25738: getting the remaining hosts for this loop 46400 1727204532.25740: done getting the remaining hosts for this loop 46400 1727204532.25743: getting the next task for host managed-node2 46400 1727204532.25751: done getting next task for host managed-node2 46400 1727204532.25755: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204532.25761: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.25784: getting variables 46400 1727204532.25786: in VariableManager get_vars() 46400 1727204532.25815: Calling all_inventory to load vars for managed-node2 46400 1727204532.25818: Calling groups_inventory to load vars for managed-node2 46400 1727204532.25820: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.25829: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.25831: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.25833: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.26710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.28342: done with get_vars() 46400 1727204532.28375: done getting variables 46400 1727204532.28430: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.040) 0:00:22.569 ***** 46400 1727204532.28470: entering _queue_task() for managed-node2/fail 46400 1727204532.28778: worker is 1 (out of 1 available) 46400 1727204532.28792: exiting _queue_task() for managed-node2/fail 46400 1727204532.28806: done queuing things up, now waiting for results queue to drain 46400 1727204532.28807: waiting for pending results... 
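[Editor's note] The teaming guard queued here is evaluated a couple of records later with ansible_distribution_major_version | int > 9, which is False on this EL 9 host. A minimal sketch of such a guard follows; only the version comparison is taken from the log, the message is an assumption.

# Illustrative: only the `| int > 9` comparison comes from the log.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL 10 or later
  when:
    - ansible_distribution_major_version | int > 9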
46400 1727204532.28995: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204532.29086: in run() - task 0affcd87-79f5-1303-fda8-000000000733 46400 1727204532.29097: variable 'ansible_search_path' from source: unknown 46400 1727204532.29101: variable 'ansible_search_path' from source: unknown 46400 1727204532.29136: calling self._execute() 46400 1727204532.29207: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.29211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.29219: variable 'omit' from source: magic vars 46400 1727204532.29513: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.29522: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.29648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.31278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.31335: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.31366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.31390: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.31412: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.31471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.31491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.31510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.31540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.31550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.31623: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.31636: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204532.31640: when evaluation is False, skipping this task 46400 1727204532.31644: _execute() done 46400 1727204532.31647: dumping result to json 46400 1727204532.31649: done dumping result, returning 46400 1727204532.31656: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000000733] 46400 1727204532.31659: sending task result for task 
0affcd87-79f5-1303-fda8-000000000733 46400 1727204532.31751: done sending task result for task 0affcd87-79f5-1303-fda8-000000000733 46400 1727204532.31753: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204532.31804: no more pending results, returning what we have 46400 1727204532.31808: results queue empty 46400 1727204532.31809: checking for any_errors_fatal 46400 1727204532.31815: done checking for any_errors_fatal 46400 1727204532.31815: checking for max_fail_percentage 46400 1727204532.31817: done checking for max_fail_percentage 46400 1727204532.31818: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.31819: done checking to see if all hosts have failed 46400 1727204532.31820: getting the remaining hosts for this loop 46400 1727204532.31821: done getting the remaining hosts for this loop 46400 1727204532.31825: getting the next task for host managed-node2 46400 1727204532.31834: done getting next task for host managed-node2 46400 1727204532.31838: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204532.31843: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204532.31906: getting variables 46400 1727204532.31909: in VariableManager get_vars() 46400 1727204532.31942: Calling all_inventory to load vars for managed-node2 46400 1727204532.31944: Calling groups_inventory to load vars for managed-node2 46400 1727204532.31946: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.31955: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.31958: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.31963: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.32825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.34394: done with get_vars() 46400 1727204532.34416: done getting variables 46400 1727204532.34463: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.060) 0:00:22.629 ***** 46400 1727204532.34492: entering _queue_task() for managed-node2/dnf 46400 1727204532.34731: worker is 1 (out of 1 available) 46400 1727204532.34746: exiting _queue_task() for managed-node2/dnf 46400 1727204532.34760: done queuing things up, now waiting for results queue to drain 46400 1727204532.34761: waiting for pending results... 
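The DNF check task queued above (roles/network/tasks/main.yml:36) uses the 'dnf' action. A hypothetical sketch, taking only the two conditions from the trace below; the package variable, state, and check_mode setting are assumptions:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed; 'network_packages' is a role default seen later in this log
    state: latest                   # assumed
  check_mode: true                  # assumed for an availability check
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined  # reported False below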
46400 1727204532.34959: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204532.35060: in run() - task 0affcd87-79f5-1303-fda8-000000000734 46400 1727204532.35074: variable 'ansible_search_path' from source: unknown 46400 1727204532.35077: variable 'ansible_search_path' from source: unknown 46400 1727204532.35109: calling self._execute() 46400 1727204532.35184: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.35189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.35196: variable 'omit' from source: magic vars 46400 1727204532.35480: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.35490: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.35631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.38362: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.38422: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.38469: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.38502: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.38527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.38604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.38631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.38657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.38698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.38712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.38827: variable 'ansible_distribution' from source: facts 46400 1727204532.38831: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.38847: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204532.38968: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.39096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.39120: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.39143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.39184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.39198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.39236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.39259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.39283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.39320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.39334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.39372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.39393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.39418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.39453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.39468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.39616: variable 'network_connections' from source: include params 46400 1727204532.39626: variable 'interface' from source: play vars 46400 1727204532.39693: variable 'interface' from source: play vars 46400 1727204532.39762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204532.39919: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204532.39963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204532.39991: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204532.40031: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204532.40074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204532.40096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204532.40120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.40144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204532.40200: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204532.40429: variable 'network_connections' from source: include params 46400 1727204532.40432: variable 'interface' from source: play vars 46400 1727204532.40495: variable 'interface' from source: play vars 46400 1727204532.40525: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204532.40528: when evaluation is False, skipping this task 46400 1727204532.40531: _execute() done 46400 1727204532.40533: dumping result to json 46400 1727204532.40535: done dumping result, returning 46400 1727204532.40544: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000734] 46400 1727204532.40549: sending task result for task 0affcd87-79f5-1303-fda8-000000000734 46400 1727204532.40649: done sending task result for task 0affcd87-79f5-1303-fda8-000000000734 46400 1727204532.40651: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204532.40702: no more pending results, returning what we have 46400 1727204532.40706: results queue empty 46400 1727204532.40707: checking for any_errors_fatal 46400 1727204532.40713: done checking for any_errors_fatal 46400 1727204532.40714: checking for max_fail_percentage 46400 1727204532.40715: done checking for max_fail_percentage 46400 1727204532.40716: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.40717: done checking to see if all hosts have failed 46400 1727204532.40717: getting the remaining hosts for this loop 46400 1727204532.40719: done getting the remaining hosts for this loop 46400 1727204532.40723: getting the next task for host managed-node2 46400 1727204532.40731: done getting next task for host managed-node2 46400 1727204532.40735: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204532.40740: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.40755: getting variables 46400 1727204532.40756: in VariableManager get_vars() 46400 1727204532.40802: Calling all_inventory to load vars for managed-node2 46400 1727204532.40805: Calling groups_inventory to load vars for managed-node2 46400 1727204532.40807: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.40816: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.40819: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.40821: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.42476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.44283: done with get_vars() 46400 1727204532.44311: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204532.44414: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.099) 0:00:22.729 ***** 46400 1727204532.44444: entering _queue_task() for managed-node2/yum 46400 1727204532.44791: worker is 1 (out of 1 available) 46400 1727204532.44822: exiting _queue_task() for managed-node2/yum 46400 1727204532.44837: done queuing things up, now waiting for results queue to drain 46400 1727204532.44839: waiting for pending results... 
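The YUM counterpart queued above (roles/network/tasks/main.yml:48) is guarded by an EL-version check; note the log's redirect of ansible.builtin.yum to ansible.builtin.dnf on this control node. A hypothetical sketch under the same assumptions as the DNF task:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed
    state: latest                   # assumed
  check_mode: true                  # assumed
  when: ansible_distribution_major_version | int < 8  # reported False below, so the task is skipped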
46400 1727204532.45168: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204532.45317: in run() - task 0affcd87-79f5-1303-fda8-000000000735 46400 1727204532.45358: variable 'ansible_search_path' from source: unknown 46400 1727204532.45374: variable 'ansible_search_path' from source: unknown 46400 1727204532.45417: calling self._execute() 46400 1727204532.45521: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.45533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.45544: variable 'omit' from source: magic vars 46400 1727204532.45981: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.46001: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.46382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.49389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.49488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.49626: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.49671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.49824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.49928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.50050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.50087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.50262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.50288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.50503: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.50524: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204532.50532: when evaluation is False, skipping this task 46400 1727204532.50539: _execute() done 46400 1727204532.50545: dumping result to json 46400 1727204532.50555: done dumping result, returning 46400 1727204532.50578: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000735] 46400 
1727204532.50682: sending task result for task 0affcd87-79f5-1303-fda8-000000000735 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204532.50853: no more pending results, returning what we have 46400 1727204532.50858: results queue empty 46400 1727204532.50859: checking for any_errors_fatal 46400 1727204532.50869: done checking for any_errors_fatal 46400 1727204532.50870: checking for max_fail_percentage 46400 1727204532.50873: done checking for max_fail_percentage 46400 1727204532.50874: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.50874: done checking to see if all hosts have failed 46400 1727204532.50875: getting the remaining hosts for this loop 46400 1727204532.50877: done getting the remaining hosts for this loop 46400 1727204532.50882: getting the next task for host managed-node2 46400 1727204532.50891: done getting next task for host managed-node2 46400 1727204532.50896: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204532.50901: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204532.50922: getting variables 46400 1727204532.50923: in VariableManager get_vars() 46400 1727204532.50968: Calling all_inventory to load vars for managed-node2 46400 1727204532.50971: Calling groups_inventory to load vars for managed-node2 46400 1727204532.50973: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.50984: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.50987: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.50990: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.52015: done sending task result for task 0affcd87-79f5-1303-fda8-000000000735 46400 1727204532.52019: WORKER PROCESS EXITING 46400 1727204532.53244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.55296: done with get_vars() 46400 1727204532.55326: done getting variables 46400 1727204532.55395: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.109) 0:00:22.838 ***** 46400 1727204532.55437: entering _queue_task() for managed-node2/fail 46400 1727204532.55786: worker is 1 (out of 1 available) 46400 1727204532.55799: exiting _queue_task() for managed-node2/fail 46400 1727204532.55812: done queuing things up, now waiting for results queue to drain 46400 1727204532.55814: waiting for pending results... 
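The consent task queued above (roles/network/tasks/main.yml:60) is another 'fail' task, guarded by the wireless/team condition. A minimal sketch, with the message text invented:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Activating wireless or team interfaces requires restarting NetworkManager; set the role's consent variable to proceed.  # assumed wording
  when: __network_wireless_connections_defined or __network_team_connections_defined  # reported False below, so the task is skipped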
46400 1727204532.56195: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204532.56363: in run() - task 0affcd87-79f5-1303-fda8-000000000736 46400 1727204532.56387: variable 'ansible_search_path' from source: unknown 46400 1727204532.56399: variable 'ansible_search_path' from source: unknown 46400 1727204532.56440: calling self._execute() 46400 1727204532.56548: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.56566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.56580: variable 'omit' from source: magic vars 46400 1727204532.56988: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.57005: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.57149: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.57368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.60567: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.60642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.60689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.60722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.60750: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.60833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.60866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.60899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.60939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.60954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.61010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.61033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.61057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.61104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.61118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.61158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.61184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.61217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.61256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.61272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.61465: variable 'network_connections' from source: include params 46400 1727204532.61475: variable 'interface' from source: play vars 46400 1727204532.61662: variable 'interface' from source: play vars 46400 1727204532.61730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204532.61921: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204532.61967: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204532.62008: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204532.62036: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204532.62091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204532.62114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204532.62139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.62167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204532.62299: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204532.62562: variable 'network_connections' 
from source: include params 46400 1727204532.62568: variable 'interface' from source: play vars 46400 1727204532.62636: variable 'interface' from source: play vars 46400 1727204532.62673: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204532.62677: when evaluation is False, skipping this task 46400 1727204532.62679: _execute() done 46400 1727204532.62681: dumping result to json 46400 1727204532.62683: done dumping result, returning 46400 1727204532.62688: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000736] 46400 1727204532.62694: sending task result for task 0affcd87-79f5-1303-fda8-000000000736 46400 1727204532.62799: done sending task result for task 0affcd87-79f5-1303-fda8-000000000736 46400 1727204532.62802: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204532.62856: no more pending results, returning what we have 46400 1727204532.62862: results queue empty 46400 1727204532.62865: checking for any_errors_fatal 46400 1727204532.62870: done checking for any_errors_fatal 46400 1727204532.62871: checking for max_fail_percentage 46400 1727204532.62872: done checking for max_fail_percentage 46400 1727204532.62873: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.62874: done checking to see if all hosts have failed 46400 1727204532.62874: getting the remaining hosts for this loop 46400 1727204532.62876: done getting the remaining hosts for this loop 46400 1727204532.62880: getting the next task for host managed-node2 46400 1727204532.62888: done getting next task for host managed-node2 46400 1727204532.62892: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204532.62896: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204532.62912: getting variables 46400 1727204532.62913: in VariableManager get_vars() 46400 1727204532.62950: Calling all_inventory to load vars for managed-node2 46400 1727204532.62953: Calling groups_inventory to load vars for managed-node2 46400 1727204532.62955: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.62969: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.62971: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.62974: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.64566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.65661: done with get_vars() 46400 1727204532.65681: done getting variables 46400 1727204532.65726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.103) 0:00:22.942 ***** 46400 1727204532.65753: entering _queue_task() for managed-node2/package 46400 1727204532.65984: worker is 1 (out of 1 available) 46400 1727204532.65999: exiting _queue_task() for managed-node2/package 46400 1727204532.66012: done queuing things up, now waiting for results queue to drain 46400 1727204532.66014: waiting for pending results... 46400 1727204532.66209: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204532.66304: in run() - task 0affcd87-79f5-1303-fda8-000000000737 46400 1727204532.66313: variable 'ansible_search_path' from source: unknown 46400 1727204532.66317: variable 'ansible_search_path' from source: unknown 46400 1727204532.66351: calling self._execute() 46400 1727204532.66439: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.66466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.66486: variable 'omit' from source: magic vars 46400 1727204532.66898: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.66918: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.67131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204532.67416: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204532.67479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204532.67518: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204532.67612: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204532.67743: variable 'network_packages' from source: role '' defaults 46400 1727204532.67840: variable '__network_provider_setup' from source: role '' defaults 46400 1727204532.67849: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204532.67908: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204532.67914: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204532.67958: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204532.68082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.69498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.69540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.69568: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.69617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.69627: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.69733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.69740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.69767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.69805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.69817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.69873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.70089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.70092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.70095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.70097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.70170: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204532.70273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.70296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.70321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.70358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.70372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.70463: variable 'ansible_python' from source: facts 46400 1727204532.70477: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204532.70556: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204532.70633: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204532.70750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.70773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.70798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.70836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.70851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.70897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.70920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.70945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.70982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.70996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.71142: variable 'network_connections' from source: include params 46400 1727204532.71148: variable 'interface' from source: play vars 46400 1727204532.71245: variable 'interface' from source: play vars 46400 1727204532.71315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204532.71340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204532.71371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.71402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204532.71446: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.71728: variable 'network_connections' from source: include params 46400 1727204532.71731: variable 'interface' from source: play vars 46400 1727204532.71821: variable 'interface' from source: play vars 46400 1727204532.71871: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204532.71947: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.72220: variable 'network_connections' from source: include params 46400 1727204532.72224: variable 'interface' from source: play vars 46400 1727204532.72281: variable 'interface' from source: play vars 46400 1727204532.72301: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204532.72355: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204532.72551: variable 'network_connections' from source: include params 46400 1727204532.72555: variable 'interface' from source: play vars 46400 1727204532.72604: variable 'interface' from source: play vars 46400 1727204532.72648: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204532.72691: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204532.72697: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204532.72742: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204532.72880: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204532.73179: variable 'network_connections' from source: include params 46400 1727204532.73182: variable 'interface' from source: play vars 46400 1727204532.73225: variable 'interface' from source: play vars 46400 1727204532.73232: variable 'ansible_distribution' from source: facts 46400 1727204532.73235: variable '__network_rh_distros' from source: role '' defaults 46400 1727204532.73243: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.73267: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204532.73375: variable 'ansible_distribution' from source: 
facts 46400 1727204532.73379: variable '__network_rh_distros' from source: role '' defaults 46400 1727204532.73382: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.73390: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204532.73497: variable 'ansible_distribution' from source: facts 46400 1727204532.73500: variable '__network_rh_distros' from source: role '' defaults 46400 1727204532.73505: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.73531: variable 'network_provider' from source: set_fact 46400 1727204532.73542: variable 'ansible_facts' from source: unknown 46400 1727204532.74589: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204532.74592: when evaluation is False, skipping this task 46400 1727204532.74594: _execute() done 46400 1727204532.74596: dumping result to json 46400 1727204532.74598: done dumping result, returning 46400 1727204532.74600: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000000737] 46400 1727204532.74602: sending task result for task 0affcd87-79f5-1303-fda8-000000000737 46400 1727204532.74672: done sending task result for task 0affcd87-79f5-1303-fda8-000000000737 46400 1727204532.74676: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204532.74728: no more pending results, returning what we have 46400 1727204532.74731: results queue empty 46400 1727204532.74732: checking for any_errors_fatal 46400 1727204532.74738: done checking for any_errors_fatal 46400 1727204532.74738: checking for max_fail_percentage 46400 1727204532.74740: done checking for max_fail_percentage 46400 1727204532.74741: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.74742: done checking to see if all hosts have failed 46400 1727204532.74743: getting the remaining hosts for this loop 46400 1727204532.74744: done getting the remaining hosts for this loop 46400 1727204532.74748: getting the next task for host managed-node2 46400 1727204532.74754: done getting next task for host managed-node2 46400 1727204532.74758: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204532.74780: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.74802: getting variables 46400 1727204532.74804: in VariableManager get_vars() 46400 1727204532.74843: Calling all_inventory to load vars for managed-node2 46400 1727204532.74846: Calling groups_inventory to load vars for managed-node2 46400 1727204532.74848: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.74857: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.74862: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.74866: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.76233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.77433: done with get_vars() 46400 1727204532.77450: done getting variables 46400 1727204532.77500: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.117) 0:00:23.059 ***** 46400 1727204532.77527: entering _queue_task() for managed-node2/package 46400 1727204532.77769: worker is 1 (out of 1 available) 46400 1727204532.77783: exiting _queue_task() for managed-node2/package 46400 1727204532.77797: done queuing things up, now waiting for results queue to drain 46400 1727204532.77798: waiting for pending results... 
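Just above, the Install packages task (main.yml:73) was skipped because network_packages is already a subset of the installed package facts. The task queued next (main.yml:85) uses the 'package' action and is guarded by network_state. A hypothetical sketch; the package names are inferred from the task name only:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager  # assumed
      - nmstate         # assumed
    state: present
  when: network_state != {}  # reported False below, so the task is skipped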
46400 1727204532.77990: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204532.78097: in run() - task 0affcd87-79f5-1303-fda8-000000000738 46400 1727204532.78109: variable 'ansible_search_path' from source: unknown 46400 1727204532.78112: variable 'ansible_search_path' from source: unknown 46400 1727204532.78140: calling self._execute() 46400 1727204532.78224: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.78244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.78339: variable 'omit' from source: magic vars 46400 1727204532.78702: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.78720: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.78852: variable 'network_state' from source: role '' defaults 46400 1727204532.78870: Evaluated conditional (network_state != {}): False 46400 1727204532.78883: when evaluation is False, skipping this task 46400 1727204532.78896: _execute() done 46400 1727204532.78903: dumping result to json 46400 1727204532.78910: done dumping result, returning 46400 1727204532.78920: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000738] 46400 1727204532.78929: sending task result for task 0affcd87-79f5-1303-fda8-000000000738 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204532.79089: no more pending results, returning what we have 46400 1727204532.79095: results queue empty 46400 1727204532.79096: checking for any_errors_fatal 46400 1727204532.79103: done checking for any_errors_fatal 46400 1727204532.79104: checking for max_fail_percentage 46400 1727204532.79106: done checking for max_fail_percentage 46400 1727204532.79107: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.79108: done checking to see if all hosts have failed 46400 1727204532.79109: getting the remaining hosts for this loop 46400 1727204532.79112: done getting the remaining hosts for this loop 46400 1727204532.79116: getting the next task for host managed-node2 46400 1727204532.79126: done getting next task for host managed-node2 46400 1727204532.79131: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204532.79137: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.79161: getting variables 46400 1727204532.79166: in VariableManager get_vars() 46400 1727204532.79205: Calling all_inventory to load vars for managed-node2 46400 1727204532.79208: Calling groups_inventory to load vars for managed-node2 46400 1727204532.79211: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.79224: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.79227: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.79231: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.80003: done sending task result for task 0affcd87-79f5-1303-fda8-000000000738 46400 1727204532.80007: WORKER PROCESS EXITING 46400 1727204532.80305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.81350: done with get_vars() 46400 1727204532.81377: done getting variables 46400 1727204532.81441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.039) 0:00:23.099 ***** 46400 1727204532.81482: entering _queue_task() for managed-node2/package 46400 1727204532.81825: worker is 1 (out of 1 available) 46400 1727204532.81838: exiting _queue_task() for managed-node2/package 46400 1727204532.81852: done queuing things up, now waiting for results queue to drain 46400 1727204532.81854: waiting for pending results... 
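The repeated get_vars() calls above pull variables from several sources (inventory, play, role defaults, facts), and later records annotate each variable with the source that won, for example "from source: role '' defaults" or "from source: play vars". A rough Python analogy for that precedence merge; this is illustrative only, not VariableManager, and the sample values and exact ordering are simplified assumptions:

    # Rough analogy, not Ansible's VariableManager: higher-precedence sources
    # shadow lower ones when variables are flattened for a task.
    from collections import ChainMap

    role_defaults = {"network_state": {}, "network_service_name": "NetworkManager"}
    host_facts    = {"ansible_distribution_major_version": "9"}   # assumed value
    play_vars     = {"interface": "veth0"}                        # assumed value
    set_facts     = {"network_provider": "nm"}

    # First mapping wins in a ChainMap, so list the highest-precedence source first.
    merged = ChainMap(set_facts, play_vars, host_facts, role_defaults)
    print(merged["network_provider"], merged["network_state"])    # nm {}
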
46400 1727204532.82312: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204532.82435: in run() - task 0affcd87-79f5-1303-fda8-000000000739 46400 1727204532.82448: variable 'ansible_search_path' from source: unknown 46400 1727204532.82451: variable 'ansible_search_path' from source: unknown 46400 1727204532.82489: calling self._execute() 46400 1727204532.82699: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.82706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.82718: variable 'omit' from source: magic vars 46400 1727204532.83199: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.83211: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.83344: variable 'network_state' from source: role '' defaults 46400 1727204532.83353: Evaluated conditional (network_state != {}): False 46400 1727204532.83356: when evaluation is False, skipping this task 46400 1727204532.83362: _execute() done 46400 1727204532.83367: dumping result to json 46400 1727204532.83370: done dumping result, returning 46400 1727204532.83374: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000739] 46400 1727204532.83380: sending task result for task 0affcd87-79f5-1303-fda8-000000000739 46400 1727204532.83494: done sending task result for task 0affcd87-79f5-1303-fda8-000000000739 46400 1727204532.83497: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204532.83546: no more pending results, returning what we have 46400 1727204532.83552: results queue empty 46400 1727204532.83553: checking for any_errors_fatal 46400 1727204532.83561: done checking for any_errors_fatal 46400 1727204532.83562: checking for max_fail_percentage 46400 1727204532.83567: done checking for max_fail_percentage 46400 1727204532.83569: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.83569: done checking to see if all hosts have failed 46400 1727204532.83570: getting the remaining hosts for this loop 46400 1727204532.83572: done getting the remaining hosts for this loop 46400 1727204532.83576: getting the next task for host managed-node2 46400 1727204532.83586: done getting next task for host managed-node2 46400 1727204532.83590: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204532.83596: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204532.83615: getting variables 46400 1727204532.83617: in VariableManager get_vars() 46400 1727204532.83657: Calling all_inventory to load vars for managed-node2 46400 1727204532.83660: Calling groups_inventory to load vars for managed-node2 46400 1727204532.83663: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.83680: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.83683: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.83686: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.85032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.85943: done with get_vars() 46400 1727204532.85965: done getting variables 46400 1727204532.86010: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.045) 0:00:23.144 ***** 46400 1727204532.86037: entering _queue_task() for managed-node2/service 46400 1727204532.86274: worker is 1 (out of 1 available) 46400 1727204532.86287: exiting _queue_task() for managed-node2/service 46400 1727204532.86299: done queuing things up, now waiting for results queue to drain 46400 1727204532.86301: waiting for pending results... 
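The next task only restarts NetworkManager when the requested connections include wireless or team interfaces; the records that follow show the role walking network_connections and evaluating (__network_wireless_connections_defined or __network_team_connections_defined) to False. The role presumably derives those flags with Jinja filters over network_connections; a plain Python sketch of the same kind of check, with an assumed connections list:

    # Illustrative equivalent of the role's check (the real flags are computed
    # from network_connections inside the role, not with this code).
    network_connections = [
        {"name": "veth0", "type": "ethernet", "state": "up"},  # assumed test data
    ]

    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined     = any(c.get("type") == "team" for c in network_connections)

    # Matches the log below: the conditional is False, so no restart is needed.
    print(wireless_defined or team_defined)  # False
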
46400 1727204532.86488: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204532.86590: in run() - task 0affcd87-79f5-1303-fda8-00000000073a 46400 1727204532.86599: variable 'ansible_search_path' from source: unknown 46400 1727204532.86602: variable 'ansible_search_path' from source: unknown 46400 1727204532.86636: calling self._execute() 46400 1727204532.86702: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.86705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.86713: variable 'omit' from source: magic vars 46400 1727204532.86992: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.87002: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.87086: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.87219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.88803: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.88856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.88886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.88914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.88935: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.88995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.89014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.89041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.89072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.89083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.89116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.89138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.89154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204532.89186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.89196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.89226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.89246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.89262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.89291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.89301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.89421: variable 'network_connections' from source: include params 46400 1727204532.89430: variable 'interface' from source: play vars 46400 1727204532.89485: variable 'interface' from source: play vars 46400 1727204532.89533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204532.89646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204532.89688: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204532.89709: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204532.89731: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204532.89762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204532.89783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204532.89804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.89822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204532.89871: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204532.90030: variable 'network_connections' from source: include params 46400 1727204532.90033: variable 'interface' 
from source: play vars 46400 1727204532.90084: variable 'interface' from source: play vars 46400 1727204532.90108: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204532.90113: when evaluation is False, skipping this task 46400 1727204532.90116: _execute() done 46400 1727204532.90118: dumping result to json 46400 1727204532.90120: done dumping result, returning 46400 1727204532.90124: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000073a] 46400 1727204532.90132: sending task result for task 0affcd87-79f5-1303-fda8-00000000073a 46400 1727204532.90223: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073a 46400 1727204532.90231: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204532.90284: no more pending results, returning what we have 46400 1727204532.90288: results queue empty 46400 1727204532.90289: checking for any_errors_fatal 46400 1727204532.90297: done checking for any_errors_fatal 46400 1727204532.90297: checking for max_fail_percentage 46400 1727204532.90299: done checking for max_fail_percentage 46400 1727204532.90300: checking to see if all hosts have failed and the running result is not ok 46400 1727204532.90301: done checking to see if all hosts have failed 46400 1727204532.90302: getting the remaining hosts for this loop 46400 1727204532.90303: done getting the remaining hosts for this loop 46400 1727204532.90307: getting the next task for host managed-node2 46400 1727204532.90316: done getting next task for host managed-node2 46400 1727204532.90324: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204532.90329: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204532.90350: getting variables 46400 1727204532.90352: in VariableManager get_vars() 46400 1727204532.90386: Calling all_inventory to load vars for managed-node2 46400 1727204532.90389: Calling groups_inventory to load vars for managed-node2 46400 1727204532.90391: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204532.90400: Calling all_plugins_play to load vars for managed-node2 46400 1727204532.90402: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204532.90404: Calling groups_plugins_play to load vars for managed-node2 46400 1727204532.91218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204532.92142: done with get_vars() 46400 1727204532.92158: done getting variables 46400 1727204532.92204: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.061) 0:00:23.206 ***** 46400 1727204532.92228: entering _queue_task() for managed-node2/service 46400 1727204532.92448: worker is 1 (out of 1 available) 46400 1727204532.92463: exiting _queue_task() for managed-node2/service 46400 1727204532.92477: done queuing things up, now waiting for results queue to drain 46400 1727204532.92479: waiting for pending results... 46400 1727204532.92654: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204532.92748: in run() - task 0affcd87-79f5-1303-fda8-00000000073b 46400 1727204532.92758: variable 'ansible_search_path' from source: unknown 46400 1727204532.92762: variable 'ansible_search_path' from source: unknown 46400 1727204532.92793: calling self._execute() 46400 1727204532.92862: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204532.92871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204532.92879: variable 'omit' from source: magic vars 46400 1727204532.93145: variable 'ansible_distribution_major_version' from source: facts 46400 1727204532.93155: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204532.93289: variable 'network_provider' from source: set_fact 46400 1727204532.93300: variable 'network_state' from source: role '' defaults 46400 1727204532.93313: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204532.93318: variable 'omit' from source: magic vars 46400 1727204532.93356: variable 'omit' from source: magic vars 46400 1727204532.93385: variable 'network_service_name' from source: role '' defaults 46400 1727204532.93431: variable 'network_service_name' from source: role '' defaults 46400 1727204532.93514: variable '__network_provider_setup' from source: role '' defaults 46400 1727204532.93517: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204532.93562: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204532.93573: variable '__network_packages_default_nm' from source: role '' defaults 
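Unlike the skipped tasks above, this one proceeds: the conditional (network_provider == "nm" or network_state != {}) evaluated True, network_service_name resolves to the NetworkManager service, and the records that follow show the service action shipping the systemd module to the host over SSH. On the managed node the effect corresponds roughly to enabling the unit and making sure it is running; a manual sketch only, not how the module is implemented:

    # Manual sketch of the intended end state on the managed host; the actual
    # work is done by Ansible's systemd module, not by shelling out like this.
    import subprocess

    unit = "NetworkManager.service"
    subprocess.run(["systemctl", "enable", "--now", unit], check=True)
    state = subprocess.run(["systemctl", "is-active", unit],
                           capture_output=True, text=True).stdout.strip()
    print(state)  # expected: active
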
46400 1727204532.93619: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204532.93769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204532.96091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204532.96143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204532.96171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204532.96200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204532.96218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204532.96277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.96297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.96318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.96345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.96355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.96389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.96405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.96425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.96451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.96469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.96611: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204532.96692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.96709: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.96725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.96754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.96767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.96829: variable 'ansible_python' from source: facts 46400 1727204532.96841: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204532.96903: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204532.96964: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204532.97041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.97058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.97082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.97106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.97117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.97149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204532.97170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204532.97191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.97215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204532.97226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204532.97345: variable 'network_connections' from 
source: include params 46400 1727204532.97352: variable 'interface' from source: play vars 46400 1727204532.97471: variable 'interface' from source: play vars 46400 1727204532.97936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204532.98126: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204532.98175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204532.98215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204532.98253: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204532.98312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204532.98340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204532.98371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204532.98403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204532.98450: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.98717: variable 'network_connections' from source: include params 46400 1727204532.98723: variable 'interface' from source: play vars 46400 1727204532.98797: variable 'interface' from source: play vars 46400 1727204532.98844: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204532.98923: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204532.99201: variable 'network_connections' from source: include params 46400 1727204532.99204: variable 'interface' from source: play vars 46400 1727204532.99275: variable 'interface' from source: play vars 46400 1727204532.99299: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204532.99375: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204532.99651: variable 'network_connections' from source: include params 46400 1727204532.99654: variable 'interface' from source: play vars 46400 1727204532.99726: variable 'interface' from source: play vars 46400 1727204532.99787: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204532.99844: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204532.99850: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204532.99911: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204533.00121: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204533.00601: variable 'network_connections' from source: include params 46400 1727204533.00606: variable 'interface' from source: play vars 46400 1727204533.00667: variable 'interface' from 
source: play vars 46400 1727204533.00675: variable 'ansible_distribution' from source: facts 46400 1727204533.00678: variable '__network_rh_distros' from source: role '' defaults 46400 1727204533.00684: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.00712: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204533.00882: variable 'ansible_distribution' from source: facts 46400 1727204533.00885: variable '__network_rh_distros' from source: role '' defaults 46400 1727204533.00890: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.00900: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204533.01066: variable 'ansible_distribution' from source: facts 46400 1727204533.01070: variable '__network_rh_distros' from source: role '' defaults 46400 1727204533.01072: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.01106: variable 'network_provider' from source: set_fact 46400 1727204533.01127: variable 'omit' from source: magic vars 46400 1727204533.01155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204533.01183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204533.01201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204533.01217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204533.01227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204533.01257: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204533.01262: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.01268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.01355: Set connection var ansible_shell_type to sh 46400 1727204533.01367: Set connection var ansible_shell_executable to /bin/sh 46400 1727204533.01373: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204533.01378: Set connection var ansible_connection to ssh 46400 1727204533.01383: Set connection var ansible_pipelining to False 46400 1727204533.01389: Set connection var ansible_timeout to 10 46400 1727204533.01416: variable 'ansible_shell_executable' from source: unknown 46400 1727204533.01419: variable 'ansible_connection' from source: unknown 46400 1727204533.01421: variable 'ansible_module_compression' from source: unknown 46400 1727204533.01423: variable 'ansible_shell_type' from source: unknown 46400 1727204533.01425: variable 'ansible_shell_executable' from source: unknown 46400 1727204533.01427: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.01432: variable 'ansible_pipelining' from source: unknown 46400 1727204533.01435: variable 'ansible_timeout' from source: unknown 46400 1727204533.01439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.01544: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204533.01555: variable 'omit' from source: magic vars 46400 1727204533.01566: starting attempt loop 46400 1727204533.01569: running the handler 46400 1727204533.01642: variable 'ansible_facts' from source: unknown 46400 1727204533.02339: _low_level_execute_command(): starting 46400 1727204533.02346: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204533.03072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204533.03085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.03097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.03110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.03150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.03157: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204533.03170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.03184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204533.03189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204533.03196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204533.03205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.03213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.03224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.03231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.03238: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204533.03248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.03322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.03341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.03354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.03449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.05122: stdout chunk (state=3): >>>/root <<< 46400 1727204533.05282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.05303: stderr chunk (state=3): >>><<< 46400 1727204533.05306: stdout chunk (state=3): >>><<< 46400 1727204533.05330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.05344: _low_level_execute_command(): starting 46400 1727204533.05347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331 `" && echo ansible-tmp-1727204533.053303-48170-153084131420331="` echo /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331 `" ) && sleep 0' 46400 1727204533.05907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.05911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.05942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.05946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.05948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.05998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.06005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.06062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.07937: stdout chunk (state=3): >>>ansible-tmp-1727204533.053303-48170-153084131420331=/root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331 <<< 46400 1727204533.08086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.08129: stderr chunk (state=3): >>><<< 46400 1727204533.08133: stdout chunk (state=3): >>><<< 46400 1727204533.08174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204533.053303-48170-153084131420331=/root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.08197: variable 'ansible_module_compression' from source: unknown 46400 1727204533.08236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204533.08290: variable 'ansible_facts' from source: unknown 46400 1727204533.08418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/AnsiballZ_systemd.py 46400 1727204533.08532: Sending initial data 46400 1727204533.08535: Sent initial data (155 bytes) 46400 1727204533.09201: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204533.09207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.09221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.09250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.09258: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204533.09269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.09279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204533.09286: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.09293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.09303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.09306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204533.09314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.09371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.09378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.09381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.09443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.11183: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204533.11214: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204533.11259: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp6vyr9say /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/AnsiballZ_systemd.py <<< 46400 1727204533.11295: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204533.13586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.13691: stderr chunk (state=3): >>><<< 46400 1727204533.13695: stdout chunk (state=3): >>><<< 46400 1727204533.13711: done transferring module to remote 46400 1727204533.13720: _low_level_execute_command(): starting 46400 1727204533.13725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/ /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/AnsiballZ_systemd.py && sleep 0' 46400 1727204533.14863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204533.14878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.14884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.14887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.14889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.14894: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204533.14896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.14898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204533.14900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204533.14902: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204533.14904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.14906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.14907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.14909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.14911: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204533.14913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.14918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.14920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.14931: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 46400 1727204533.14982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.16755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.16804: stderr chunk (state=3): >>><<< 46400 1727204533.16808: stdout chunk (state=3): >>><<< 46400 1727204533.16822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.16825: _low_level_execute_command(): starting 46400 1727204533.16829: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/AnsiballZ_systemd.py && sleep 0' 46400 1727204533.17262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.17280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.17319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.17322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.17325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.17386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.17389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.17392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.17442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.42912: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "2029432000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", 
"LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdog<<< 46400 1727204533.42931: stdout chunk (state=3): >>>Signal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204533.44474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204533.44479: stdout chunk (state=3): >>><<< 46400 1727204533.44484: stderr chunk (state=3): >>><<< 46400 1727204533.44498: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "2029432000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204533.44613: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204533.44627: _low_level_execute_command(): starting 46400 1727204533.44632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204533.053303-48170-153084131420331/ > /dev/null 2>&1 && sleep 0' 46400 1727204533.45093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.45136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204533.45149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.45201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.45207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.45219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.45272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.47149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.47154: stderr chunk (state=3): >>><<< 46400 1727204533.47159: stdout chunk (state=3): >>><<< 46400 1727204533.47180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.47196: handler run complete 46400 1727204533.47258: attempt loop complete, returning result 46400 1727204533.47266: _execute() done 46400 1727204533.47268: dumping result to json 46400 1727204533.47283: done dumping result, returning 46400 1727204533.47300: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-00000000073b] 46400 1727204533.47306: sending task result for task 0affcd87-79f5-1303-fda8-00000000073b ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204533.47665: no more pending results, returning what we have 46400 1727204533.47670: results queue empty 46400 1727204533.47671: checking for any_errors_fatal 46400 1727204533.47680: done checking for any_errors_fatal 46400 1727204533.47681: checking for max_fail_percentage 46400 1727204533.47682: done checking for max_fail_percentage 46400 1727204533.47683: checking to see if all hosts have failed and the running result is not ok 46400 1727204533.47684: done checking to see if all hosts have failed 46400 1727204533.47685: getting the remaining hosts for this loop 46400 1727204533.47686: done getting the remaining hosts for this loop 46400 1727204533.47691: getting the next task for host managed-node2 46400 1727204533.47698: done getting next task for host managed-node2 46400 1727204533.47704: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204533.47709: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204533.47721: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073b 46400 1727204533.47727: WORKER PROCESS EXITING 46400 1727204533.47733: getting variables 46400 1727204533.47735: in VariableManager get_vars() 46400 1727204533.47771: Calling all_inventory to load vars for managed-node2 46400 1727204533.47774: Calling groups_inventory to load vars for managed-node2 46400 1727204533.47776: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204533.47786: Calling all_plugins_play to load vars for managed-node2 46400 1727204533.47789: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204533.47791: Calling groups_plugins_play to load vars for managed-node2 46400 1727204533.49553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204533.51273: done with get_vars() 46400 1727204533.51309: done getting variables 46400 1727204533.51376: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:13 -0400 (0:00:00.591) 0:00:23.798 ***** 46400 1727204533.51425: entering _queue_task() for managed-node2/service 46400 1727204533.51777: worker is 1 (out of 1 available) 46400 1727204533.51791: exiting _queue_task() for managed-node2/service 46400 1727204533.51804: done queuing things up, now waiting for results queue to drain 46400 1727204533.51805: waiting for pending results... 
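The wpa_supplicant task queued here is gated on __network_wpa_supplicant_required, which the trace below resolves from the role's defaults alongside __network_ieee802_1x_connections_defined and __network_wireless_connections_defined. A sketch of how such a default could be wired up; the variable names come from the trace, but the combining expression is an assumption, not copied from the role:

    # illustrative defaults (defaults/main.yml style); expression assumed
    __network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"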
46400 1727204533.52120: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204533.52272: in run() - task 0affcd87-79f5-1303-fda8-00000000073c 46400 1727204533.52290: variable 'ansible_search_path' from source: unknown 46400 1727204533.52294: variable 'ansible_search_path' from source: unknown 46400 1727204533.52330: calling self._execute() 46400 1727204533.52431: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.52437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.52447: variable 'omit' from source: magic vars 46400 1727204533.52835: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.52848: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204533.52971: variable 'network_provider' from source: set_fact 46400 1727204533.52977: Evaluated conditional (network_provider == "nm"): True 46400 1727204533.53075: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204533.53168: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204533.53344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204533.55611: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204533.55694: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204533.55739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204533.55791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204533.55823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204533.55925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204533.55965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204533.55997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204533.56046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204533.56073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204533.56125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204533.56153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204533.56189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204533.56235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204533.56255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204533.56305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204533.56336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204533.56372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204533.56417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204533.56437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204533.56602: variable 'network_connections' from source: include params 46400 1727204533.56620: variable 'interface' from source: play vars 46400 1727204533.56701: variable 'interface' from source: play vars 46400 1727204533.56781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204533.56972: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204533.57014: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204533.57052: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204533.57092: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204533.57141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204533.57197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204533.57231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204533.57266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204533.57322: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204533.57591: variable 'network_connections' from source: include params 46400 1727204533.57602: variable 'interface' from source: play vars 46400 1727204533.57676: variable 'interface' from source: play vars 46400 1727204533.57725: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204533.57733: when evaluation is False, skipping this task 46400 1727204533.57740: _execute() done 46400 1727204533.57747: dumping result to json 46400 1727204533.57754: done dumping result, returning 46400 1727204533.57773: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-00000000073c] 46400 1727204533.57793: sending task result for task 0affcd87-79f5-1303-fda8-00000000073c skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204533.57946: no more pending results, returning what we have 46400 1727204533.57950: results queue empty 46400 1727204533.57951: checking for any_errors_fatal 46400 1727204533.57976: done checking for any_errors_fatal 46400 1727204533.57977: checking for max_fail_percentage 46400 1727204533.57979: done checking for max_fail_percentage 46400 1727204533.57980: checking to see if all hosts have failed and the running result is not ok 46400 1727204533.57981: done checking to see if all hosts have failed 46400 1727204533.57981: getting the remaining hosts for this loop 46400 1727204533.57983: done getting the remaining hosts for this loop 46400 1727204533.57987: getting the next task for host managed-node2 46400 1727204533.57996: done getting next task for host managed-node2 46400 1727204533.58000: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204533.58004: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204533.58025: getting variables 46400 1727204533.58027: in VariableManager get_vars() 46400 1727204533.58063: Calling all_inventory to load vars for managed-node2 46400 1727204533.58071: Calling groups_inventory to load vars for managed-node2 46400 1727204533.58073: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204533.58084: Calling all_plugins_play to load vars for managed-node2 46400 1727204533.58086: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204533.58090: Calling groups_plugins_play to load vars for managed-node2 46400 1727204533.58624: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073c 46400 1727204533.58627: WORKER PROCESS EXITING 46400 1727204533.59686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204533.61314: done with get_vars() 46400 1727204533.61345: done getting variables 46400 1727204533.61415: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:13 -0400 (0:00:00.100) 0:00:23.899 ***** 46400 1727204533.61452: entering _queue_task() for managed-node2/service 46400 1727204533.61791: worker is 1 (out of 1 available) 46400 1727204533.61805: exiting _queue_task() for managed-node2/service 46400 1727204533.61818: done queuing things up, now waiting for results queue to drain 46400 1727204533.61820: waiting for pending results... 
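The "Enable and start wpa_supplicant" task above was skipped because __network_wpa_supplicant_required evaluated to False for this connection set. A sketch of what that skipped task plausibly looks like, based on the 'service' action and the two conditionals evaluated in the trace (network_provider == "nm" and __network_wpa_supplicant_required); the unit name is an assumption:

    - name: Enable and start wpa_supplicant
      service:
        name: wpa_supplicant      # assumed unit name
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required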
46400 1727204533.62121: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204533.62257: in run() - task 0affcd87-79f5-1303-fda8-00000000073d 46400 1727204533.62279: variable 'ansible_search_path' from source: unknown 46400 1727204533.62284: variable 'ansible_search_path' from source: unknown 46400 1727204533.62320: calling self._execute() 46400 1727204533.62415: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.62419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.62428: variable 'omit' from source: magic vars 46400 1727204533.62796: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.62813: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204533.62912: variable 'network_provider' from source: set_fact 46400 1727204533.62923: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204533.62926: when evaluation is False, skipping this task 46400 1727204533.62929: _execute() done 46400 1727204533.62932: dumping result to json 46400 1727204533.62934: done dumping result, returning 46400 1727204533.62940: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-00000000073d] 46400 1727204533.62945: sending task result for task 0affcd87-79f5-1303-fda8-00000000073d 46400 1727204533.63038: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073d 46400 1727204533.63040: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204533.63091: no more pending results, returning what we have 46400 1727204533.63096: results queue empty 46400 1727204533.63097: checking for any_errors_fatal 46400 1727204533.63105: done checking for any_errors_fatal 46400 1727204533.63106: checking for max_fail_percentage 46400 1727204533.63107: done checking for max_fail_percentage 46400 1727204533.63108: checking to see if all hosts have failed and the running result is not ok 46400 1727204533.63109: done checking to see if all hosts have failed 46400 1727204533.63110: getting the remaining hosts for this loop 46400 1727204533.63112: done getting the remaining hosts for this loop 46400 1727204533.63115: getting the next task for host managed-node2 46400 1727204533.63124: done getting next task for host managed-node2 46400 1727204533.63128: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204533.63134: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204533.63153: getting variables 46400 1727204533.63154: in VariableManager get_vars() 46400 1727204533.63190: Calling all_inventory to load vars for managed-node2 46400 1727204533.63193: Calling groups_inventory to load vars for managed-node2 46400 1727204533.63195: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204533.63203: Calling all_plugins_play to load vars for managed-node2 46400 1727204533.63205: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204533.63207: Calling groups_plugins_play to load vars for managed-node2 46400 1727204533.64030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204533.65149: done with get_vars() 46400 1727204533.65168: done getting variables 46400 1727204533.65212: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:13 -0400 (0:00:00.037) 0:00:23.936 ***** 46400 1727204533.65238: entering _queue_task() for managed-node2/copy 46400 1727204533.65461: worker is 1 (out of 1 available) 46400 1727204533.65476: exiting _queue_task() for managed-node2/copy 46400 1727204533.65490: done queuing things up, now waiting for results queue to drain 46400 1727204533.65491: waiting for pending results... 
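The "Enable network service" task above was likewise skipped (and its result censored by no_log) because network_provider is "nm" rather than "initscripts" on this host. A sketch of that task, again using the 'service' action loaded in the trace; the unit name is an assumption:

    - name: Enable network service
      service:
        name: network             # assumed unit name for the initscripts provider
        state: started
        enabled: true
      when:
        - network_provider == "initscripts"
      no_log: true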
46400 1727204533.65674: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204533.65833: in run() - task 0affcd87-79f5-1303-fda8-00000000073e 46400 1727204533.65851: variable 'ansible_search_path' from source: unknown 46400 1727204533.65858: variable 'ansible_search_path' from source: unknown 46400 1727204533.65894: calling self._execute() 46400 1727204533.65986: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.65998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.66010: variable 'omit' from source: magic vars 46400 1727204533.66399: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.66414: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204533.66527: variable 'network_provider' from source: set_fact 46400 1727204533.66541: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204533.66546: when evaluation is False, skipping this task 46400 1727204533.66551: _execute() done 46400 1727204533.66557: dumping result to json 46400 1727204533.66569: done dumping result, returning 46400 1727204533.66580: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-00000000073e] 46400 1727204533.66589: sending task result for task 0affcd87-79f5-1303-fda8-00000000073e skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204533.66719: no more pending results, returning what we have 46400 1727204533.66723: results queue empty 46400 1727204533.66724: checking for any_errors_fatal 46400 1727204533.66731: done checking for any_errors_fatal 46400 1727204533.66732: checking for max_fail_percentage 46400 1727204533.66733: done checking for max_fail_percentage 46400 1727204533.66734: checking to see if all hosts have failed and the running result is not ok 46400 1727204533.66735: done checking to see if all hosts have failed 46400 1727204533.66736: getting the remaining hosts for this loop 46400 1727204533.66737: done getting the remaining hosts for this loop 46400 1727204533.66741: getting the next task for host managed-node2 46400 1727204533.66751: done getting next task for host managed-node2 46400 1727204533.66755: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204533.66760: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204533.66778: getting variables 46400 1727204533.66780: in VariableManager get_vars() 46400 1727204533.66813: Calling all_inventory to load vars for managed-node2 46400 1727204533.66816: Calling groups_inventory to load vars for managed-node2 46400 1727204533.66818: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204533.66829: Calling all_plugins_play to load vars for managed-node2 46400 1727204533.66832: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204533.66834: Calling groups_plugins_play to load vars for managed-node2 46400 1727204533.67821: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073e 46400 1727204533.67825: WORKER PROCESS EXITING 46400 1727204533.68415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204533.69309: done with get_vars() 46400 1727204533.69325: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:13 -0400 (0:00:00.041) 0:00:23.978 ***** 46400 1727204533.69392: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204533.69616: worker is 1 (out of 1 available) 46400 1727204533.69629: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204533.69643: done queuing things up, now waiting for results queue to drain 46400 1727204533.69645: waiting for pending results... 
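The trace that follows templates get_ansible_managed.j2 and ships the fedora.linux_system_roles.network_connections module to the node, driven by the network_connections variable from include params and the interface play var. A sketch of how the role plausibly invokes that module; the parameter names are assumptions, while the module, task name, and variable names are taken from the trace:

    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: "{{ network_provider }}"            # assumed parameter name
        connections: "{{ network_connections }}"      # assumed parameter name
        __header: "{{ __lsr_ansible_managed }}"       # assumed wiring of the ansible_managed header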
46400 1727204533.69874: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204533.70025: in run() - task 0affcd87-79f5-1303-fda8-00000000073f 46400 1727204533.70042: variable 'ansible_search_path' from source: unknown 46400 1727204533.70048: variable 'ansible_search_path' from source: unknown 46400 1727204533.70085: calling self._execute() 46400 1727204533.70179: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.70190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.70203: variable 'omit' from source: magic vars 46400 1727204533.70582: variable 'ansible_distribution_major_version' from source: facts 46400 1727204533.70599: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204533.70609: variable 'omit' from source: magic vars 46400 1727204533.70678: variable 'omit' from source: magic vars 46400 1727204533.70837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204533.73171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204533.73237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204533.73278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204533.73314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204533.73340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204533.73423: variable 'network_provider' from source: set_fact 46400 1727204533.73549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204533.73587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204533.73617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204533.73662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204533.73686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204533.73769: variable 'omit' from source: magic vars 46400 1727204533.73876: variable 'omit' from source: magic vars 46400 1727204533.73977: variable 'network_connections' from source: include params 46400 1727204533.73992: variable 'interface' from source: play vars 46400 1727204533.74054: variable 'interface' from source: play vars 46400 1727204533.74207: variable 'omit' from source: magic vars 46400 1727204533.74222: variable '__lsr_ansible_managed' from source: task vars 46400 1727204533.74281: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204533.74466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204533.74678: Loaded config def from plugin (lookup/template) 46400 1727204533.74687: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204533.74717: File lookup term: get_ansible_managed.j2 46400 1727204533.74724: variable 'ansible_search_path' from source: unknown 46400 1727204533.74733: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204533.74751: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204533.74778: variable 'ansible_search_path' from source: unknown 46400 1727204533.81106: variable 'ansible_managed' from source: unknown 46400 1727204533.81274: variable 'omit' from source: magic vars 46400 1727204533.81311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204533.81343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204533.81368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204533.81395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204533.81409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204533.81438: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204533.81446: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.81453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.81550: Set connection var ansible_shell_type to sh 46400 1727204533.81566: Set connection var ansible_shell_executable to /bin/sh 46400 1727204533.81577: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204533.81585: Set connection var ansible_connection to ssh 46400 1727204533.81596: Set connection var ansible_pipelining to False 46400 1727204533.81606: Set connection var ansible_timeout to 10 46400 1727204533.81637: variable 'ansible_shell_executable' from source: unknown 46400 1727204533.81646: variable 'ansible_connection' from source: unknown 46400 1727204533.81653: variable 'ansible_module_compression' 
from source: unknown 46400 1727204533.81658: variable 'ansible_shell_type' from source: unknown 46400 1727204533.81666: variable 'ansible_shell_executable' from source: unknown 46400 1727204533.81674: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204533.81681: variable 'ansible_pipelining' from source: unknown 46400 1727204533.81687: variable 'ansible_timeout' from source: unknown 46400 1727204533.81693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204533.81831: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204533.81853: variable 'omit' from source: magic vars 46400 1727204533.81863: starting attempt loop 46400 1727204533.81872: running the handler 46400 1727204533.81888: _low_level_execute_command(): starting 46400 1727204533.81898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204533.82643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204533.82658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.82677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.82696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.82740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.82752: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204533.82766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.82783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204533.82797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204533.82807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204533.82818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.82830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.82845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.82856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.82868: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204533.82881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.82961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.82986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.83003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.83078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.84741: stdout chunk (state=3): >>>/root <<< 46400 1727204533.84825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.84927: stderr chunk (state=3): >>><<< 46400 1727204533.84941: stdout 
chunk (state=3): >>><<< 46400 1727204533.85072: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.85076: _low_level_execute_command(): starting 46400 1727204533.85080: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258 `" && echo ansible-tmp-1727204533.8497825-48252-142158352587258="` echo /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258 `" ) && sleep 0' 46400 1727204533.85972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.85976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.86019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204533.86022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.86025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204533.86027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.86930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.86951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.86983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204533.88833: stdout chunk (state=3): >>>ansible-tmp-1727204533.8497825-48252-142158352587258=/root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258 <<< 46400 1727204533.88938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 
1727204533.89017: stderr chunk (state=3): >>><<< 46400 1727204533.89021: stdout chunk (state=3): >>><<< 46400 1727204533.89071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204533.8497825-48252-142158352587258=/root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204533.89372: variable 'ansible_module_compression' from source: unknown 46400 1727204533.89376: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204533.89379: variable 'ansible_facts' from source: unknown 46400 1727204533.89384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/AnsiballZ_network_connections.py 46400 1727204533.89924: Sending initial data 46400 1727204533.89927: Sent initial data (168 bytes) 46400 1727204533.92222: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.92227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.92251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.92255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204533.92258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.92529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.92714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.92746: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 46400 1727204533.94480: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204533.94540: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204533.94577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpvx7pknjz /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/AnsiballZ_network_connections.py <<< 46400 1727204533.94593: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204533.96606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204533.96610: stderr chunk (state=3): >>><<< 46400 1727204533.96612: stdout chunk (state=3): >>><<< 46400 1727204533.96635: done transferring module to remote 46400 1727204533.96645: _low_level_execute_command(): starting 46400 1727204533.96650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/ /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/AnsiballZ_network_connections.py && sleep 0' 46400 1727204533.98710: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204533.98720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.98731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.98747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.98792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.98799: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204533.98809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.98823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204533.98830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204533.98837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204533.98845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204533.98854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204533.98870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204533.98878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204533.98884: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204533.98894: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204533.98962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204533.98981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204533.98991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204533.99055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.00831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.00835: stdout chunk (state=3): >>><<< 46400 1727204534.00843: stderr chunk (state=3): >>><<< 46400 1727204534.00863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.00877: _low_level_execute_command(): starting 46400 1727204534.00880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/AnsiballZ_network_connections.py && sleep 0' 46400 1727204534.02808: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.02825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.02841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.02866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.02910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.02923: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.02943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.02968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.02982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.02995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.03008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.03024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.03040: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.03053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.03065: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.03083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.03158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.03187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.03204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.03283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.26305: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204534.27826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204534.27831: stdout chunk (state=3): >>><<< 46400 1727204534.27833: stderr chunk (state=3): >>><<< 46400 1727204534.27983: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
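For orientation, the module_args captured in the result above correspond to the role's public variables rather than anything written by hand against the network_connections module. A minimal play that would drive an equivalent invocation is sketched below; the play name and host pattern are illustrative, and it is assumed here that the role translates network_provider and network_connections directly into the "provider" and "connections" arguments seen in the log.

# Illustrative play: feeds the same profile seen in the logged module_args
# ("statebr", type bridge, autoconnect off, no DHCPv4, no IPv6 auto-conf).
- name: Create the statebr bridge profile without activating it
  hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm            # matches "provider": "nm" in module_args
        network_connections:
          - name: statebr
            type: bridge
            autoconnect: false          # profile is created but not brought up
            persistent_state: present   # add/keep the profile on disk
            ip:
              dhcp4: false
              auto6: false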
46400 1727204534.27987: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204534.27991: _low_level_execute_command(): starting 46400 1727204534.27993: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204533.8497825-48252-142158352587258/ > /dev/null 2>&1 && sleep 0' 46400 1727204534.28568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.28585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.28601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.28618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.28662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.28679: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.28692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.28709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.28721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.28730: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.28741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.28754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.28771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.28783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.28793: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.28805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.28888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.28904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.28919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.29001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.30862: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 46400 1727204534.30868: stdout chunk (state=3): >>><<< 46400 1727204534.30877: stderr chunk (state=3): >>><<< 46400 1727204534.30894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.30901: handler run complete 46400 1727204534.30936: attempt loop complete, returning result 46400 1727204534.30939: _execute() done 46400 1727204534.30941: dumping result to json 46400 1727204534.30947: done dumping result, returning 46400 1727204534.30956: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-00000000073f] 46400 1727204534.30961: sending task result for task 0affcd87-79f5-1303-fda8-00000000073f 46400 1727204534.31079: done sending task result for task 0affcd87-79f5-1303-fda8-00000000073f 46400 1727204534.31082: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461 46400 1727204534.31196: no more pending results, returning what we have 46400 1727204534.31200: results queue empty 46400 1727204534.31201: checking for any_errors_fatal 46400 1727204534.31207: done checking for any_errors_fatal 46400 1727204534.31208: checking for max_fail_percentage 46400 1727204534.31209: done checking for max_fail_percentage 46400 1727204534.31210: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.31211: done checking to see if all hosts have failed 46400 1727204534.31212: getting the remaining hosts for this loop 46400 1727204534.31213: done getting the remaining hosts for this loop 46400 1727204534.31217: getting the next task for host managed-node2 46400 1727204534.31224: done getting next task for host managed-node2 46400 1727204534.31228: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204534.31232: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204534.31243: getting variables 46400 1727204534.31244: in VariableManager get_vars() 46400 1727204534.31279: Calling all_inventory to load vars for managed-node2 46400 1727204534.31282: Calling groups_inventory to load vars for managed-node2 46400 1727204534.31284: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.31293: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.31295: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.31297: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.32625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.34222: done with get_vars() 46400 1727204534.34249: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:14 -0400 (0:00:00.649) 0:00:24.627 ***** 46400 1727204534.34340: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204534.34675: worker is 1 (out of 1 available) 46400 1727204534.34689: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204534.34702: done queuing things up, now waiting for results queue to drain 46400 1727204534.34705: waiting for pending results... 
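The task queued here, "Configure networking state" at tasks/main.yml:171, targets the fedora.linux_system_roles.network_state action and, as the next entries show, is guarded by a conditional on network_state; with the role default of {} it is skipped. The task body itself is not in the log, so the sketch below is only a plausible shape under that assumption, and the module parameter name is assumed for illustration.

# Sketch of the guarded state task (actual body lives in the role, not in this log);
# the conditional matches the "false_condition" reported in the skip result below.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    desired_state: "{{ network_state }}"   # parameter name assumed for illustration
  when: network_state != {}                # skipped in this run: network_state is the default {}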
46400 1727204534.35007: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204534.35155: in run() - task 0affcd87-79f5-1303-fda8-000000000740 46400 1727204534.35177: variable 'ansible_search_path' from source: unknown 46400 1727204534.35185: variable 'ansible_search_path' from source: unknown 46400 1727204534.35223: calling self._execute() 46400 1727204534.35334: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.35345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.35359: variable 'omit' from source: magic vars 46400 1727204534.35722: variable 'ansible_distribution_major_version' from source: facts 46400 1727204534.35741: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204534.35872: variable 'network_state' from source: role '' defaults 46400 1727204534.35888: Evaluated conditional (network_state != {}): False 46400 1727204534.35896: when evaluation is False, skipping this task 46400 1727204534.35907: _execute() done 46400 1727204534.35915: dumping result to json 46400 1727204534.35922: done dumping result, returning 46400 1727204534.35932: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000000740] 46400 1727204534.35944: sending task result for task 0affcd87-79f5-1303-fda8-000000000740 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204534.36095: no more pending results, returning what we have 46400 1727204534.36101: results queue empty 46400 1727204534.36102: checking for any_errors_fatal 46400 1727204534.36113: done checking for any_errors_fatal 46400 1727204534.36114: checking for max_fail_percentage 46400 1727204534.36116: done checking for max_fail_percentage 46400 1727204534.36117: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.36118: done checking to see if all hosts have failed 46400 1727204534.36118: getting the remaining hosts for this loop 46400 1727204534.36120: done getting the remaining hosts for this loop 46400 1727204534.36124: getting the next task for host managed-node2 46400 1727204534.36134: done getting next task for host managed-node2 46400 1727204534.36138: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204534.36145: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204534.36163: getting variables 46400 1727204534.36167: in VariableManager get_vars() 46400 1727204534.36204: Calling all_inventory to load vars for managed-node2 46400 1727204534.36207: Calling groups_inventory to load vars for managed-node2 46400 1727204534.36209: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.36222: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.36225: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.36228: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.37484: done sending task result for task 0affcd87-79f5-1303-fda8-000000000740 46400 1727204534.37487: WORKER PROCESS EXITING 46400 1727204534.38088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.39722: done with get_vars() 46400 1727204534.39747: done getting variables 46400 1727204534.39810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:14 -0400 (0:00:00.055) 0:00:24.683 ***** 46400 1727204534.39846: entering _queue_task() for managed-node2/debug 46400 1727204534.40168: worker is 1 (out of 1 available) 46400 1727204534.40181: exiting _queue_task() for managed-node2/debug 46400 1727204534.40194: done queuing things up, now waiting for results queue to drain 46400 1727204534.40196: waiting for pending results... 
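The debug task queued next ("Show stderr messages for the network_connections", tasks/main.yml:177) prints the stderr_lines of the result registered by the earlier network_connections call, as its "ok" output a little further on confirms. The exact task body is not shown in the log, but a minimal sketch consistent with that output would be:

# Sketch of the debug task whose "ok" result appears below; the variable name
# comes from that result, the task body itself is assumed.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines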
46400 1727204534.40483: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204534.40646: in run() - task 0affcd87-79f5-1303-fda8-000000000741 46400 1727204534.40670: variable 'ansible_search_path' from source: unknown 46400 1727204534.40678: variable 'ansible_search_path' from source: unknown 46400 1727204534.40716: calling self._execute() 46400 1727204534.40814: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.40825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.40839: variable 'omit' from source: magic vars 46400 1727204534.41203: variable 'ansible_distribution_major_version' from source: facts 46400 1727204534.41220: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204534.41231: variable 'omit' from source: magic vars 46400 1727204534.41305: variable 'omit' from source: magic vars 46400 1727204534.41344: variable 'omit' from source: magic vars 46400 1727204534.41394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204534.41436: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204534.41462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204534.41487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.41506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.41542: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204534.41552: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.41560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.41670: Set connection var ansible_shell_type to sh 46400 1727204534.41687: Set connection var ansible_shell_executable to /bin/sh 46400 1727204534.41698: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204534.41707: Set connection var ansible_connection to ssh 46400 1727204534.41717: Set connection var ansible_pipelining to False 46400 1727204534.41731: Set connection var ansible_timeout to 10 46400 1727204534.41761: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.41772: variable 'ansible_connection' from source: unknown 46400 1727204534.41780: variable 'ansible_module_compression' from source: unknown 46400 1727204534.41788: variable 'ansible_shell_type' from source: unknown 46400 1727204534.41795: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.41802: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.41810: variable 'ansible_pipelining' from source: unknown 46400 1727204534.41817: variable 'ansible_timeout' from source: unknown 46400 1727204534.41824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.41975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204534.41994: variable 'omit' from source: magic vars 46400 1727204534.42004: starting attempt loop 46400 1727204534.42011: running the handler 46400 1727204534.42144: variable '__network_connections_result' from source: set_fact 46400 1727204534.42209: handler run complete 46400 1727204534.42232: attempt loop complete, returning result 46400 1727204534.42240: _execute() done 46400 1727204534.42248: dumping result to json 46400 1727204534.42256: done dumping result, returning 46400 1727204534.42272: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000000741] 46400 1727204534.42284: sending task result for task 0affcd87-79f5-1303-fda8-000000000741 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461" ] } 46400 1727204534.42446: no more pending results, returning what we have 46400 1727204534.42451: results queue empty 46400 1727204534.42452: checking for any_errors_fatal 46400 1727204534.42459: done checking for any_errors_fatal 46400 1727204534.42460: checking for max_fail_percentage 46400 1727204534.42462: done checking for max_fail_percentage 46400 1727204534.42463: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.42466: done checking to see if all hosts have failed 46400 1727204534.42467: getting the remaining hosts for this loop 46400 1727204534.42469: done getting the remaining hosts for this loop 46400 1727204534.42473: getting the next task for host managed-node2 46400 1727204534.42482: done getting next task for host managed-node2 46400 1727204534.42487: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204534.42492: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204534.42505: getting variables 46400 1727204534.42507: in VariableManager get_vars() 46400 1727204534.42544: Calling all_inventory to load vars for managed-node2 46400 1727204534.42547: Calling groups_inventory to load vars for managed-node2 46400 1727204534.42550: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.42565: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.42568: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.42572: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.43587: done sending task result for task 0affcd87-79f5-1303-fda8-000000000741 46400 1727204534.43591: WORKER PROCESS EXITING 46400 1727204534.44258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.46010: done with get_vars() 46400 1727204534.46034: done getting variables 46400 1727204534.46100: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:14 -0400 (0:00:00.062) 0:00:24.745 ***** 46400 1727204534.46140: entering _queue_task() for managed-node2/debug 46400 1727204534.46471: worker is 1 (out of 1 available) 46400 1727204534.46485: exiting _queue_task() for managed-node2/debug 46400 1727204534.46497: done queuing things up, now waiting for results queue to drain 46400 1727204534.46499: waiting for pending results... 
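The following debug task ("Show debug messages for the network_connections", tasks/main.yml:181) dumps the entire registered __network_connections_result, whose full contents appear in the "ok" result below. Outside the role, a test playbook could consume that same registered variable; the check below is a hypothetical consumer, not taken from this run, using only keys visible in the logged result (changed, stderr).

# Hypothetical consumer of the registered result (illustrative only, not a role task):
- name: Verify that the statebr profile was added
  assert:
    that:
      - __network_connections_result is changed
      - "'statebr' in __network_connections_result.stderr"
    fail_msg: network_connections did not report adding the statebr profile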
46400 1727204534.46793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204534.46940: in run() - task 0affcd87-79f5-1303-fda8-000000000742 46400 1727204534.46962: variable 'ansible_search_path' from source: unknown 46400 1727204534.46974: variable 'ansible_search_path' from source: unknown 46400 1727204534.47013: calling self._execute() 46400 1727204534.47113: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.47125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.47138: variable 'omit' from source: magic vars 46400 1727204534.47513: variable 'ansible_distribution_major_version' from source: facts 46400 1727204534.47531: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204534.47542: variable 'omit' from source: magic vars 46400 1727204534.47617: variable 'omit' from source: magic vars 46400 1727204534.47660: variable 'omit' from source: magic vars 46400 1727204534.47715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204534.47756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204534.47787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204534.47811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.47831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.47869: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204534.47879: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.47887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.47993: Set connection var ansible_shell_type to sh 46400 1727204534.48009: Set connection var ansible_shell_executable to /bin/sh 46400 1727204534.48019: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204534.48031: Set connection var ansible_connection to ssh 46400 1727204534.48046: Set connection var ansible_pipelining to False 46400 1727204534.48056: Set connection var ansible_timeout to 10 46400 1727204534.48087: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.48096: variable 'ansible_connection' from source: unknown 46400 1727204534.48103: variable 'ansible_module_compression' from source: unknown 46400 1727204534.48110: variable 'ansible_shell_type' from source: unknown 46400 1727204534.48116: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.48122: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.48130: variable 'ansible_pipelining' from source: unknown 46400 1727204534.48137: variable 'ansible_timeout' from source: unknown 46400 1727204534.48149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.48301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204534.48317: variable 'omit' from source: magic vars 46400 1727204534.48328: starting attempt loop 46400 1727204534.48335: running the handler 46400 1727204534.48393: variable '__network_connections_result' from source: set_fact 46400 1727204534.48485: variable '__network_connections_result' from source: set_fact 46400 1727204534.48613: handler run complete 46400 1727204534.48645: attempt loop complete, returning result 46400 1727204534.48653: _execute() done 46400 1727204534.48660: dumping result to json 46400 1727204534.48671: done dumping result, returning 46400 1727204534.48684: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000000742] 46400 1727204534.48698: sending task result for task 0affcd87-79f5-1303-fda8-000000000742 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461" ] } } 46400 1727204534.48906: no more pending results, returning what we have 46400 1727204534.48911: results queue empty 46400 1727204534.48912: checking for any_errors_fatal 46400 1727204534.48920: done checking for any_errors_fatal 46400 1727204534.48921: checking for max_fail_percentage 46400 1727204534.48922: done checking for max_fail_percentage 46400 1727204534.48924: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.48924: done checking to see if all hosts have failed 46400 1727204534.48925: getting the remaining hosts for this loop 46400 1727204534.48927: done getting the remaining hosts for this loop 46400 1727204534.48931: getting the next task for host managed-node2 46400 1727204534.48941: done getting next task for host managed-node2 46400 1727204534.48945: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204534.48950: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204534.48962: getting variables 46400 1727204534.48966: in VariableManager get_vars() 46400 1727204534.49006: Calling all_inventory to load vars for managed-node2 46400 1727204534.49009: Calling groups_inventory to load vars for managed-node2 46400 1727204534.49018: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.49029: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.49032: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.49035: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.49984: done sending task result for task 0affcd87-79f5-1303-fda8-000000000742 46400 1727204534.49987: WORKER PROCESS EXITING 46400 1727204534.50519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.51422: done with get_vars() 46400 1727204534.51442: done getting variables 46400 1727204534.51490: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:14 -0400 (0:00:00.053) 0:00:24.799 ***** 46400 1727204534.51517: entering _queue_task() for managed-node2/debug 46400 1727204534.51758: worker is 1 (out of 1 available) 46400 1727204534.51774: exiting _queue_task() for managed-node2/debug 46400 1727204534.51787: done queuing things up, now waiting for results queue to drain 46400 1727204534.51789: waiting for pending results... 
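The task queued at this point ("Show debug messages for the network_state", tasks/main.yml:186) is the debug counterpart of the earlier state task and, as the skip result just below shows, is guarded by the same network_state != {} condition. A brief sketch, with the debugged variable name assumed since it is not visible in this log:

# Sketch of the guarded debug task; skipped here because network_state is the default {}.
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result   # variable name assumed, not visible in this log
  when: network_state != {}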
46400 1727204534.52047: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204534.52219: in run() - task 0affcd87-79f5-1303-fda8-000000000743 46400 1727204534.52241: variable 'ansible_search_path' from source: unknown 46400 1727204534.52249: variable 'ansible_search_path' from source: unknown 46400 1727204534.52292: calling self._execute() 46400 1727204534.52395: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.52406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.52427: variable 'omit' from source: magic vars 46400 1727204534.52807: variable 'ansible_distribution_major_version' from source: facts 46400 1727204534.52826: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204534.52960: variable 'network_state' from source: role '' defaults 46400 1727204534.52985: Evaluated conditional (network_state != {}): False 46400 1727204534.52993: when evaluation is False, skipping this task 46400 1727204534.53000: _execute() done 46400 1727204534.53008: dumping result to json 46400 1727204534.53016: done dumping result, returning 46400 1727204534.53027: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000000743] 46400 1727204534.53039: sending task result for task 0affcd87-79f5-1303-fda8-000000000743 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204534.53192: no more pending results, returning what we have 46400 1727204534.53200: results queue empty 46400 1727204534.53202: checking for any_errors_fatal 46400 1727204534.53212: done checking for any_errors_fatal 46400 1727204534.53213: checking for max_fail_percentage 46400 1727204534.53217: done checking for max_fail_percentage 46400 1727204534.53219: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.53219: done checking to see if all hosts have failed 46400 1727204534.53220: getting the remaining hosts for this loop 46400 1727204534.53222: done getting the remaining hosts for this loop 46400 1727204534.53226: getting the next task for host managed-node2 46400 1727204534.53239: done getting next task for host managed-node2 46400 1727204534.53244: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204534.53252: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204534.53274: getting variables 46400 1727204534.53276: in VariableManager get_vars() 46400 1727204534.53311: Calling all_inventory to load vars for managed-node2 46400 1727204534.53313: Calling groups_inventory to load vars for managed-node2 46400 1727204534.53315: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.53327: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.53329: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.53331: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.58878: done sending task result for task 0affcd87-79f5-1303-fda8-000000000743 46400 1727204534.58884: WORKER PROCESS EXITING 46400 1727204534.58997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.60793: done with get_vars() 46400 1727204534.60831: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:14 -0400 (0:00:00.094) 0:00:24.893 ***** 46400 1727204534.60933: entering _queue_task() for managed-node2/ping 46400 1727204534.61302: worker is 1 (out of 1 available) 46400 1727204534.61315: exiting _queue_task() for managed-node2/ping 46400 1727204534.61327: done queuing things up, now waiting for results queue to drain 46400 1727204534.61328: waiting for pending results... 46400 1727204534.61637: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204534.61813: in run() - task 0affcd87-79f5-1303-fda8-000000000744 46400 1727204534.61835: variable 'ansible_search_path' from source: unknown 46400 1727204534.61843: variable 'ansible_search_path' from source: unknown 46400 1727204534.61893: calling self._execute() 46400 1727204534.61996: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.62011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.62027: variable 'omit' from source: magic vars 46400 1727204534.62446: variable 'ansible_distribution_major_version' from source: facts 46400 1727204534.62468: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204534.62480: variable 'omit' from source: magic vars 46400 1727204534.62558: variable 'omit' from source: magic vars 46400 1727204534.62603: variable 'omit' from source: magic vars 46400 1727204534.62657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204534.62701: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204534.62729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204534.62756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.62780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204534.62818: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204534.62827: variable 'ansible_host' from source: host vars 
for 'managed-node2' 46400 1727204534.62835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.62964: Set connection var ansible_shell_type to sh 46400 1727204534.62990: Set connection var ansible_shell_executable to /bin/sh 46400 1727204534.63000: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204534.63010: Set connection var ansible_connection to ssh 46400 1727204534.63020: Set connection var ansible_pipelining to False 46400 1727204534.63030: Set connection var ansible_timeout to 10 46400 1727204534.63065: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.63077: variable 'ansible_connection' from source: unknown 46400 1727204534.63088: variable 'ansible_module_compression' from source: unknown 46400 1727204534.63099: variable 'ansible_shell_type' from source: unknown 46400 1727204534.63105: variable 'ansible_shell_executable' from source: unknown 46400 1727204534.63112: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204534.63120: variable 'ansible_pipelining' from source: unknown 46400 1727204534.63126: variable 'ansible_timeout' from source: unknown 46400 1727204534.63133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204534.63382: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204534.63401: variable 'omit' from source: magic vars 46400 1727204534.63417: starting attempt loop 46400 1727204534.63427: running the handler 46400 1727204534.63445: _low_level_execute_command(): starting 46400 1727204534.63457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204534.64254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.64277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.64299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.64320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.64368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.64382: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.64403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.64424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.64436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.64448: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.64466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.64483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.64504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.64521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.64534: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204534.64549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.64628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.64648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.64668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.64748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.66400: stdout chunk (state=3): >>>/root <<< 46400 1727204534.66578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.66582: stdout chunk (state=3): >>><<< 46400 1727204534.66591: stderr chunk (state=3): >>><<< 46400 1727204534.66616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.66629: _low_level_execute_command(): starting 46400 1727204534.66635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519 `" && echo ansible-tmp-1727204534.6661513-48338-76777119063519="` echo /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519 `" ) && sleep 0' 46400 1727204534.67271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.67281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.67291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.67305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.67344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.67351: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.67365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.67378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.67386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.67392: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 46400 1727204534.67401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.67410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.67421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.67428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.67435: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.67444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.67516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.67531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.67540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.67607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.69470: stdout chunk (state=3): >>>ansible-tmp-1727204534.6661513-48338-76777119063519=/root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519 <<< 46400 1727204534.69649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.69653: stdout chunk (state=3): >>><<< 46400 1727204534.69662: stderr chunk (state=3): >>><<< 46400 1727204534.69682: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204534.6661513-48338-76777119063519=/root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.69728: variable 'ansible_module_compression' from source: unknown 46400 1727204534.69767: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204534.69802: variable 'ansible_facts' from source: unknown 46400 1727204534.69876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/AnsiballZ_ping.py 46400 1727204534.70012: Sending initial data 46400 1727204534.70016: Sent initial data (152 bytes) 46400 1727204534.70917: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.70926: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 46400 1727204534.70937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.70951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.70992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.70999: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.71009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.71023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.71030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.71037: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.71045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.71056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.71069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.71085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.71091: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.71101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.71172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.71186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.71196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.71269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.72964: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204534.72998: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204534.73037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpy0a4rpzx /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/AnsiballZ_ping.py <<< 46400 1727204534.73075: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204534.74128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.74409: stderr chunk (state=3): >>><<< 46400 1727204534.74412: stdout chunk (state=3): >>><<< 46400 1727204534.74414: done transferring module to remote 46400 1727204534.74417: _low_level_execute_command(): starting 46400 1727204534.74423: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/ /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/AnsiballZ_ping.py && sleep 0' 46400 1727204534.75035: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.75050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.75076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.75095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.75137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.75151: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.75173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.75196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.75209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.75222: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.75234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.75249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.75269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.75287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.75301: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.75315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.75401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.75426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.75442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.75519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.77220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.77314: stderr chunk (state=3): >>><<< 46400 1727204534.77325: stdout chunk (state=3): >>><<< 46400 1727204534.77373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.77377: _low_level_execute_command(): starting 46400 1727204534.77380: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/AnsiballZ_ping.py && sleep 0' 46400 1727204534.78047: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204534.78067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.78083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.78101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.78150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.78168: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204534.78184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.78203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204534.78215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204534.78226: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204534.78244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204534.78258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204534.78279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204534.78292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204534.78304: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204534.78319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.78404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.78426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204534.78445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.78532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.91493: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204534.92473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204534.92499: stdout chunk (state=3): >>><<< 46400 1727204534.92503: stderr chunk (state=3): >>><<< 46400 1727204534.92579: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204534.92583: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204534.92590: _low_level_execute_command(): starting 46400 1727204534.92592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204534.6661513-48338-76777119063519/ > /dev/null 2>&1 && sleep 0' 46400 1727204534.93446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204534.93514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204534.93529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 46400 1727204534.93539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204534.93599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204534.96023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204534.96027: stdout chunk (state=3): >>><<< 46400 1727204534.96029: stderr chunk (state=3): >>><<< 46400 1727204534.96174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204534.96177: handler run complete 46400 1727204534.96180: attempt loop complete, returning result 46400 1727204534.96182: _execute() done 46400 1727204534.96184: dumping result to json 46400 1727204534.96185: done dumping result, returning 46400 1727204534.96187: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000000744] 46400 1727204534.96189: sending task result for task 0affcd87-79f5-1303-fda8-000000000744 46400 1727204534.96271: done sending task result for task 0affcd87-79f5-1303-fda8-000000000744 46400 1727204534.96276: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204534.96343: no more pending results, returning what we have 46400 1727204534.96347: results queue empty 46400 1727204534.96351: checking for any_errors_fatal 46400 1727204534.96359: done checking for any_errors_fatal 46400 1727204534.96361: checking for max_fail_percentage 46400 1727204534.96363: done checking for max_fail_percentage 46400 1727204534.96365: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.96366: done checking to see if all hosts have failed 46400 1727204534.96367: getting the remaining hosts for this loop 46400 1727204534.96369: done getting the remaining hosts for this loop 46400 1727204534.96372: getting the next task for host managed-node2 46400 1727204534.96387: done getting next task for host managed-node2 46400 1727204534.96390: ^ task is: TASK: meta (role_complete) 46400 1727204534.96395: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204534.96406: getting variables 46400 1727204534.96408: in VariableManager get_vars() 46400 1727204534.96445: Calling all_inventory to load vars for managed-node2 46400 1727204534.96450: Calling groups_inventory to load vars for managed-node2 46400 1727204534.96452: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.96470: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.96473: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.96476: Calling groups_plugins_play to load vars for managed-node2 46400 1727204534.98200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204534.99733: done with get_vars() 46400 1727204534.99753: done getting variables 46400 1727204534.99818: done queuing things up, now waiting for results queue to drain 46400 1727204534.99820: results queue empty 46400 1727204534.99820: checking for any_errors_fatal 46400 1727204534.99822: done checking for any_errors_fatal 46400 1727204534.99823: checking for max_fail_percentage 46400 1727204534.99824: done checking for max_fail_percentage 46400 1727204534.99824: checking to see if all hosts have failed and the running result is not ok 46400 1727204534.99825: done checking to see if all hosts have failed 46400 1727204534.99825: getting the remaining hosts for this loop 46400 1727204534.99826: done getting the remaining hosts for this loop 46400 1727204534.99828: getting the next task for host managed-node2 46400 1727204534.99831: done getting next task for host managed-node2 46400 1727204534.99833: ^ task is: TASK: Show result 46400 1727204534.99834: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204534.99841: getting variables 46400 1727204534.99842: in VariableManager get_vars() 46400 1727204534.99851: Calling all_inventory to load vars for managed-node2 46400 1727204534.99852: Calling groups_inventory to load vars for managed-node2 46400 1727204534.99854: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204534.99858: Calling all_plugins_play to load vars for managed-node2 46400 1727204534.99859: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204534.99861: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.01385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.02543: done with get_vars() 46400 1727204535.02567: done getting variables 46400 1727204535.02603: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.416) 0:00:25.310 ***** 46400 1727204535.02627: entering _queue_task() for managed-node2/debug 46400 1727204535.02923: worker is 1 (out of 1 available) 46400 1727204535.02939: exiting _queue_task() for managed-node2/debug 46400 1727204535.02952: done queuing things up, now waiting for results queue to drain 46400 1727204535.02953: waiting for pending results... 
46400 1727204535.03138: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204535.03225: in run() - task 0affcd87-79f5-1303-fda8-0000000006b2 46400 1727204535.03237: variable 'ansible_search_path' from source: unknown 46400 1727204535.03242: variable 'ansible_search_path' from source: unknown 46400 1727204535.03276: calling self._execute() 46400 1727204535.03346: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.03353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.03366: variable 'omit' from source: magic vars 46400 1727204535.03710: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.03713: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.03717: variable 'omit' from source: magic vars 46400 1727204535.03767: variable 'omit' from source: magic vars 46400 1727204535.03783: variable 'omit' from source: magic vars 46400 1727204535.03871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204535.03878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204535.03977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204535.03981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.03984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.04028: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204535.04033: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.04087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.04125: Set connection var ansible_shell_type to sh 46400 1727204535.04130: Set connection var ansible_shell_executable to /bin/sh 46400 1727204535.04145: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204535.04154: Set connection var ansible_connection to ssh 46400 1727204535.04217: Set connection var ansible_pipelining to False 46400 1727204535.04224: Set connection var ansible_timeout to 10 46400 1727204535.04242: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.04245: variable 'ansible_connection' from source: unknown 46400 1727204535.04248: variable 'ansible_module_compression' from source: unknown 46400 1727204535.04250: variable 'ansible_shell_type' from source: unknown 46400 1727204535.04313: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.04325: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.04328: variable 'ansible_pipelining' from source: unknown 46400 1727204535.04355: variable 'ansible_timeout' from source: unknown 46400 1727204535.04362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.04411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204535.04420: variable 'omit' from source: magic vars 46400 1727204535.04426: 
starting attempt loop 46400 1727204535.04428: running the handler 46400 1727204535.04519: variable '__network_connections_result' from source: set_fact 46400 1727204535.04611: variable '__network_connections_result' from source: set_fact 46400 1727204535.04716: handler run complete 46400 1727204535.04741: attempt loop complete, returning result 46400 1727204535.04744: _execute() done 46400 1727204535.04747: dumping result to json 46400 1727204535.04750: done dumping result, returning 46400 1727204535.04753: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-0000000006b2] 46400 1727204535.04759: sending task result for task 0affcd87-79f5-1303-fda8-0000000006b2 46400 1727204535.04861: done sending task result for task 0affcd87-79f5-1303-fda8-0000000006b2 46400 1727204535.04866: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 01d469d6-102d-4f29-8240-bb96e82c7461" ] } } 46400 1727204535.04941: no more pending results, returning what we have 46400 1727204535.04946: results queue empty 46400 1727204535.04947: checking for any_errors_fatal 46400 1727204535.04949: done checking for any_errors_fatal 46400 1727204535.04950: checking for max_fail_percentage 46400 1727204535.04956: done checking for max_fail_percentage 46400 1727204535.04957: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.04958: done checking to see if all hosts have failed 46400 1727204535.04959: getting the remaining hosts for this loop 46400 1727204535.04962: done getting the remaining hosts for this loop 46400 1727204535.04968: getting the next task for host managed-node2 46400 1727204535.04976: done getting next task for host managed-node2 46400 1727204535.04980: ^ task is: TASK: Asserts 46400 1727204535.04983: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204535.04987: getting variables 46400 1727204535.04988: in VariableManager get_vars() 46400 1727204535.05016: Calling all_inventory to load vars for managed-node2 46400 1727204535.05018: Calling groups_inventory to load vars for managed-node2 46400 1727204535.05021: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.05031: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.05033: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.05036: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.07980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.09684: done with get_vars() 46400 1727204535.09708: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.071) 0:00:25.382 ***** 46400 1727204535.09809: entering _queue_task() for managed-node2/include_tasks 46400 1727204535.10094: worker is 1 (out of 1 available) 46400 1727204535.10110: exiting _queue_task() for managed-node2/include_tasks 46400 1727204535.10123: done queuing things up, now waiting for results queue to drain 46400 1727204535.10125: waiting for pending results... 46400 1727204535.10310: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204535.10483: in run() - task 0affcd87-79f5-1303-fda8-0000000005b9 46400 1727204535.10488: variable 'ansible_search_path' from source: unknown 46400 1727204535.10491: variable 'ansible_search_path' from source: unknown 46400 1727204535.10493: variable 'lsr_assert' from source: include params 46400 1727204535.10741: variable 'lsr_assert' from source: include params 46400 1727204535.10833: variable 'omit' from source: magic vars 46400 1727204535.10958: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.10969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.10990: variable 'omit' from source: magic vars 46400 1727204535.11223: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.11232: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.11239: variable 'item' from source: unknown 46400 1727204535.11288: variable 'item' from source: unknown 46400 1727204535.11333: variable 'item' from source: unknown 46400 1727204535.11421: variable 'item' from source: unknown 46400 1727204535.11610: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.11625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.11629: variable 'omit' from source: magic vars 46400 1727204535.11720: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.11725: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.11728: variable 'item' from source: unknown 46400 1727204535.11789: variable 'item' from source: unknown 46400 1727204535.11828: variable 'item' from source: unknown 46400 1727204535.11916: variable 'item' from source: unknown 46400 1727204535.12001: dumping result to json 46400 1727204535.12003: done dumping result, returning 46400 1727204535.12005: done running TaskExecutor() for managed-node2/TASK: Asserts 
[0affcd87-79f5-1303-fda8-0000000005b9] 46400 1727204535.12007: sending task result for task 0affcd87-79f5-1303-fda8-0000000005b9 46400 1727204535.12055: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005b9 46400 1727204535.12058: WORKER PROCESS EXITING 46400 1727204535.12096: no more pending results, returning what we have 46400 1727204535.12101: in VariableManager get_vars() 46400 1727204535.12143: Calling all_inventory to load vars for managed-node2 46400 1727204535.12146: Calling groups_inventory to load vars for managed-node2 46400 1727204535.12149: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.12189: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.12194: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.12197: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.13371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.14431: done with get_vars() 46400 1727204535.14448: variable 'ansible_search_path' from source: unknown 46400 1727204535.14449: variable 'ansible_search_path' from source: unknown 46400 1727204535.14483: variable 'ansible_search_path' from source: unknown 46400 1727204535.14484: variable 'ansible_search_path' from source: unknown 46400 1727204535.14501: we have included files to process 46400 1727204535.14502: generating all_blocks data 46400 1727204535.14504: done generating all_blocks data 46400 1727204535.14507: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204535.14508: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204535.14509: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204535.14604: in VariableManager get_vars() 46400 1727204535.14618: done with get_vars() 46400 1727204535.14736: done processing included file 46400 1727204535.14738: iterating over new_blocks loaded from include file 46400 1727204535.14739: in VariableManager get_vars() 46400 1727204535.14748: done with get_vars() 46400 1727204535.14750: filtering new block on tags 46400 1727204535.14779: done filtering new block on tags 46400 1727204535.14781: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 46400 1727204535.14785: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204535.14785: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204535.14788: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204535.14852: in VariableManager get_vars() 46400 1727204535.14868: done with get_vars() 46400 1727204535.15020: done processing included file 46400 1727204535.15022: iterating over new_blocks loaded from include file 46400 1727204535.15023: in VariableManager get_vars() 46400 
1727204535.15032: done with get_vars() 46400 1727204535.15033: filtering new block on tags 46400 1727204535.15065: done filtering new block on tags 46400 1727204535.15067: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 46400 1727204535.15069: extending task lists for all hosts with included blocks 46400 1727204535.15875: done extending task lists 46400 1727204535.15876: done processing included files 46400 1727204535.15877: results queue empty 46400 1727204535.15877: checking for any_errors_fatal 46400 1727204535.15881: done checking for any_errors_fatal 46400 1727204535.15882: checking for max_fail_percentage 46400 1727204535.15882: done checking for max_fail_percentage 46400 1727204535.15886: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.15887: done checking to see if all hosts have failed 46400 1727204535.15888: getting the remaining hosts for this loop 46400 1727204535.15889: done getting the remaining hosts for this loop 46400 1727204535.15891: getting the next task for host managed-node2 46400 1727204535.15900: done getting next task for host managed-node2 46400 1727204535.15902: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204535.15905: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204535.15913: getting variables 46400 1727204535.15914: in VariableManager get_vars() 46400 1727204535.15924: Calling all_inventory to load vars for managed-node2 46400 1727204535.15926: Calling groups_inventory to load vars for managed-node2 46400 1727204535.15927: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.15932: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.15933: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.15935: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.16814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.18097: done with get_vars() 46400 1727204535.18116: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.083) 0:00:25.466 ***** 46400 1727204535.18187: entering _queue_task() for managed-node2/include_tasks 46400 1727204535.18476: worker is 1 (out of 1 available) 46400 1727204535.18491: exiting _queue_task() for managed-node2/include_tasks 46400 1727204535.18505: done queuing things up, now waiting for results queue to drain 46400 1727204535.18507: waiting for pending results... 46400 1727204535.18825: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204535.18931: in run() - task 0affcd87-79f5-1303-fda8-0000000008a8 46400 1727204535.18938: variable 'ansible_search_path' from source: unknown 46400 1727204535.18941: variable 'ansible_search_path' from source: unknown 46400 1727204535.19003: calling self._execute() 46400 1727204535.19106: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.19116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.19137: variable 'omit' from source: magic vars 46400 1727204535.19577: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.19599: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.19608: _execute() done 46400 1727204535.19619: dumping result to json 46400 1727204535.19623: done dumping result, returning 46400 1727204535.19625: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-0000000008a8] 46400 1727204535.19628: sending task result for task 0affcd87-79f5-1303-fda8-0000000008a8 46400 1727204535.19744: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008a8 46400 1727204535.19747: WORKER PROCESS EXITING 46400 1727204535.19833: no more pending results, returning what we have 46400 1727204535.19839: in VariableManager get_vars() 46400 1727204535.19903: Calling all_inventory to load vars for managed-node2 46400 1727204535.19906: Calling groups_inventory to load vars for managed-node2 46400 1727204535.19930: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.19941: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.19944: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.19946: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.20798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204535.22145: done with get_vars() 46400 1727204535.22162: variable 'ansible_search_path' from source: unknown 46400 1727204535.22163: variable 'ansible_search_path' from source: unknown 46400 1727204535.22170: variable 'item' from source: include params 46400 1727204535.22271: variable 'item' from source: include params 46400 1727204535.22298: we have included files to process 46400 1727204535.22299: generating all_blocks data 46400 1727204535.22300: done generating all_blocks data 46400 1727204535.22301: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204535.22302: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204535.22303: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204535.22430: done processing included file 46400 1727204535.22431: iterating over new_blocks loaded from include file 46400 1727204535.22432: in VariableManager get_vars() 46400 1727204535.22443: done with get_vars() 46400 1727204535.22445: filtering new block on tags 46400 1727204535.22463: done filtering new block on tags 46400 1727204535.22468: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204535.22476: extending task lists for all hosts with included blocks 46400 1727204535.22626: done extending task lists 46400 1727204535.22627: done processing included files 46400 1727204535.22627: results queue empty 46400 1727204535.22628: checking for any_errors_fatal 46400 1727204535.22631: done checking for any_errors_fatal 46400 1727204535.22631: checking for max_fail_percentage 46400 1727204535.22632: done checking for max_fail_percentage 46400 1727204535.22632: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.22633: done checking to see if all hosts have failed 46400 1727204535.22633: getting the remaining hosts for this loop 46400 1727204535.22634: done getting the remaining hosts for this loop 46400 1727204535.22636: getting the next task for host managed-node2 46400 1727204535.22639: done getting next task for host managed-node2 46400 1727204535.22641: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204535.22643: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204535.22645: getting variables 46400 1727204535.22645: in VariableManager get_vars() 46400 1727204535.22654: Calling all_inventory to load vars for managed-node2 46400 1727204535.22656: Calling groups_inventory to load vars for managed-node2 46400 1727204535.22658: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.22665: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.22668: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.22671: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.23605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.24665: done with get_vars() 46400 1727204535.24682: done getting variables 46400 1727204535.24771: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.066) 0:00:25.532 ***** 46400 1727204535.24793: entering _queue_task() for managed-node2/stat 46400 1727204535.25033: worker is 1 (out of 1 available) 46400 1727204535.25046: exiting _queue_task() for managed-node2/stat 46400 1727204535.25059: done queuing things up, now waiting for results queue to drain 46400 1727204535.25065: waiting for pending results... 46400 1727204535.25256: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204535.25359: in run() - task 0affcd87-79f5-1303-fda8-000000000928 46400 1727204535.25372: variable 'ansible_search_path' from source: unknown 46400 1727204535.25376: variable 'ansible_search_path' from source: unknown 46400 1727204535.25421: calling self._execute() 46400 1727204535.25502: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.25506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.25509: variable 'omit' from source: magic vars 46400 1727204535.25806: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.25830: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.25834: variable 'omit' from source: magic vars 46400 1727204535.25892: variable 'omit' from source: magic vars 46400 1727204535.25978: variable 'interface' from source: play vars 46400 1727204535.25992: variable 'omit' from source: magic vars 46400 1727204535.26045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204535.26067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204535.26083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204535.26105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.26125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.26140: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204535.26143: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.26146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.26217: Set connection var ansible_shell_type to sh 46400 1727204535.26225: Set connection var ansible_shell_executable to /bin/sh 46400 1727204535.26230: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204535.26250: Set connection var ansible_connection to ssh 46400 1727204535.26253: Set connection var ansible_pipelining to False 46400 1727204535.26258: Set connection var ansible_timeout to 10 46400 1727204535.26291: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.26295: variable 'ansible_connection' from source: unknown 46400 1727204535.26297: variable 'ansible_module_compression' from source: unknown 46400 1727204535.26300: variable 'ansible_shell_type' from source: unknown 46400 1727204535.26303: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.26307: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.26310: variable 'ansible_pipelining' from source: unknown 46400 1727204535.26313: variable 'ansible_timeout' from source: unknown 46400 1727204535.26315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.26498: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204535.26525: variable 'omit' from source: magic vars 46400 1727204535.26528: starting attempt loop 46400 1727204535.26531: running the handler 46400 1727204535.26533: _low_level_execute_command(): starting 46400 1727204535.26535: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204535.27209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.27225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.27239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204535.27258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.27272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.27330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.27339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.27400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.29110: stdout chunk (state=3): >>>/root <<< 46400 
1727204535.29312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204535.29448: stderr chunk (state=3): >>><<< 46400 1727204535.29480: stdout chunk (state=3): >>><<< 46400 1727204535.29513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204535.29535: _low_level_execute_command(): starting 46400 1727204535.29540: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158 `" && echo ansible-tmp-1727204535.2951272-48410-55351196633158="` echo /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158 `" ) && sleep 0' 46400 1727204535.30402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204535.30419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.30443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.30470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.30518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.30543: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204535.30562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.30590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204535.30609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204535.30626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204535.30643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.30667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.30695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.30717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.30736: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204535.30755: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.30866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.30984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204535.31018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.31155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.33050: stdout chunk (state=3): >>>ansible-tmp-1727204535.2951272-48410-55351196633158=/root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158 <<< 46400 1727204535.33317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204535.33324: stdout chunk (state=3): >>><<< 46400 1727204535.33330: stderr chunk (state=3): >>><<< 46400 1727204535.33447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204535.2951272-48410-55351196633158=/root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204535.33458: variable 'ansible_module_compression' from source: unknown 46400 1727204535.33541: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204535.33607: variable 'ansible_facts' from source: unknown 46400 1727204535.33728: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/AnsiballZ_stat.py 46400 1727204535.33898: Sending initial data 46400 1727204535.33901: Sent initial data (152 bytes) 46400 1727204535.34876: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204535.34892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.34909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.34928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.35000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204535.35003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 
46400 1727204535.35006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.35008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204535.35010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.35078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.35081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.35133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.36834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204535.36868: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204535.36906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp56xb5xy3 /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/AnsiballZ_stat.py <<< 46400 1727204535.36942: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204535.38151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204535.38244: stderr chunk (state=3): >>><<< 46400 1727204535.38248: stdout chunk (state=3): >>><<< 46400 1727204535.38250: done transferring module to remote 46400 1727204535.38253: _low_level_execute_command(): starting 46400 1727204535.38255: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/ /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/AnsiballZ_stat.py && sleep 0' 46400 1727204535.38957: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204535.38974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.38989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.39003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.39040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.39048: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204535.39058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.39073: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 46400 1727204535.39083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204535.39094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204535.39102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.39112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.39123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.39139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.39151: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204535.39169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.39253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.39352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204535.39376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.39455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.41370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204535.41374: stdout chunk (state=3): >>><<< 46400 1727204535.41377: stderr chunk (state=3): >>><<< 46400 1727204535.41380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204535.41382: _low_level_execute_command(): starting 46400 1727204535.41384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/AnsiballZ_stat.py && sleep 0' 46400 1727204535.42212: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204535.42227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.42243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.42270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.42312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 46400 1727204535.42326: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204535.42341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.42358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204535.42377: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204535.42390: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204535.42402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.42417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.42433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.42447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.42460: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204535.42479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.42560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.42585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204535.42609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.42691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.55692: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204535.56699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204535.56703: stdout chunk (state=3): >>><<< 46400 1727204535.56705: stderr chunk (state=3): >>><<< 46400 1727204535.56850: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204535.56859: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204535.56873: _low_level_execute_command(): starting 46400 1727204535.56876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204535.2951272-48410-55351196633158/ > /dev/null 2>&1 && sleep 0' 46400 1727204535.58697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204535.58793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.58817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.58836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.58980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.59001: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204535.59032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.59063: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 46400 1727204535.59079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204535.59096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204535.59151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204535.59171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204535.59188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204535.59200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204535.59211: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204535.59224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204535.59411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204535.59428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204535.59442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204535.59569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204535.61386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204535.61416: stderr chunk (state=3): >>><<< 46400 1727204535.61420: stdout chunk (state=3): >>><<< 46400 1727204535.61674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204535.61678: handler run complete 46400 1727204535.61680: attempt loop complete, returning result 46400 1727204535.61683: _execute() done 46400 1727204535.61685: dumping result to json 46400 1727204535.61687: done dumping result, returning 46400 1727204535.61689: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000000928] 46400 1727204535.61691: sending task result for task 0affcd87-79f5-1303-fda8-000000000928 46400 1727204535.61776: done sending task result for task 0affcd87-79f5-1303-fda8-000000000928 46400 1727204535.61779: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204535.61843: no more pending results, returning what we have 46400 1727204535.61847: results queue empty 
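
The ok: result above (stat.exists == false for /sys/class/net/statebr) comes from the included get_interface_stat.yml. Reconstructed from the module arguments shown in the log and from the interface_stat variable that the following assert task reads, the included task is roughly the sketch below; this is inferred from the log, not a verbatim copy of the file, and the register name is an assumption based on the later variable lookup:

    - name: Get stat for interface {{ interface }}
      stat:
        get_attributes: false
        get_checksum: false
        get_mime: false
        path: /sys/class/net/{{ interface }}   # templated to /sys/class/net/statebr in this run
      register: interface_stat
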
46400 1727204535.61848: checking for any_errors_fatal 46400 1727204535.61849: done checking for any_errors_fatal 46400 1727204535.61850: checking for max_fail_percentage 46400 1727204535.61852: done checking for max_fail_percentage 46400 1727204535.61853: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.61854: done checking to see if all hosts have failed 46400 1727204535.61855: getting the remaining hosts for this loop 46400 1727204535.61857: done getting the remaining hosts for this loop 46400 1727204535.61871: getting the next task for host managed-node2 46400 1727204535.61883: done getting next task for host managed-node2 46400 1727204535.61886: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 46400 1727204535.61890: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204535.61895: getting variables 46400 1727204535.61897: in VariableManager get_vars() 46400 1727204535.61934: Calling all_inventory to load vars for managed-node2 46400 1727204535.61937: Calling groups_inventory to load vars for managed-node2 46400 1727204535.61941: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.61954: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.61957: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.61965: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.63657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.65443: done with get_vars() 46400 1727204535.65476: done getting variables 46400 1727204535.65542: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204535.65681: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.409) 0:00:25.941 ***** 46400 1727204535.65713: entering _queue_task() for managed-node2/assert 46400 1727204535.66573: worker is 1 (out of 1 available) 46400 1727204535.66586: exiting _queue_task() for managed-node2/assert 46400 1727204535.66599: done queuing things up, now waiting for results queue to drain 46400 1727204535.66600: waiting for pending results... 
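
The next task, at assert_device_absent.yml:5, is the assertion whose conditional the log later records as "Evaluated conditional (not interface_stat.stat.exists): True". A minimal sketch of that assert task, inferred from the task name and the evaluated conditional (the real file may also set a failure message or other options):

    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists
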
46400 1727204535.67533: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 46400 1727204535.67638: in run() - task 0affcd87-79f5-1303-fda8-0000000008a9 46400 1727204535.67650: variable 'ansible_search_path' from source: unknown 46400 1727204535.67654: variable 'ansible_search_path' from source: unknown 46400 1727204535.67893: calling self._execute() 46400 1727204535.67985: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.67991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.68001: variable 'omit' from source: magic vars 46400 1727204535.68797: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.68810: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.68952: variable 'omit' from source: magic vars 46400 1727204535.69054: variable 'omit' from source: magic vars 46400 1727204535.69469: variable 'interface' from source: play vars 46400 1727204535.69585: variable 'omit' from source: magic vars 46400 1727204535.69626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204535.69666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204535.69690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204535.69707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.69719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.69750: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204535.69754: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.69757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.69851: Set connection var ansible_shell_type to sh 46400 1727204535.69862: Set connection var ansible_shell_executable to /bin/sh 46400 1727204535.69867: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204535.69872: Set connection var ansible_connection to ssh 46400 1727204535.69877: Set connection var ansible_pipelining to False 46400 1727204535.69883: Set connection var ansible_timeout to 10 46400 1727204535.69906: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.69910: variable 'ansible_connection' from source: unknown 46400 1727204535.69913: variable 'ansible_module_compression' from source: unknown 46400 1727204535.69916: variable 'ansible_shell_type' from source: unknown 46400 1727204535.69919: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.69921: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.69924: variable 'ansible_pipelining' from source: unknown 46400 1727204535.69926: variable 'ansible_timeout' from source: unknown 46400 1727204535.69928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.70274: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204535.70286: variable 'omit' from source: magic vars 46400 1727204535.70289: starting attempt loop 46400 1727204535.70291: running the handler 46400 1727204535.70433: variable 'interface_stat' from source: set_fact 46400 1727204535.70443: Evaluated conditional (not interface_stat.stat.exists): True 46400 1727204535.70449: handler run complete 46400 1727204535.70466: attempt loop complete, returning result 46400 1727204535.70470: _execute() done 46400 1727204535.70473: dumping result to json 46400 1727204535.70477: done dumping result, returning 46400 1727204535.70479: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [0affcd87-79f5-1303-fda8-0000000008a9] 46400 1727204535.70484: sending task result for task 0affcd87-79f5-1303-fda8-0000000008a9 46400 1727204535.70581: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008a9 46400 1727204535.70584: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204535.70636: no more pending results, returning what we have 46400 1727204535.70641: results queue empty 46400 1727204535.70642: checking for any_errors_fatal 46400 1727204535.70653: done checking for any_errors_fatal 46400 1727204535.70654: checking for max_fail_percentage 46400 1727204535.70656: done checking for max_fail_percentage 46400 1727204535.70657: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.70657: done checking to see if all hosts have failed 46400 1727204535.70658: getting the remaining hosts for this loop 46400 1727204535.70663: done getting the remaining hosts for this loop 46400 1727204535.70669: getting the next task for host managed-node2 46400 1727204535.70680: done getting next task for host managed-node2 46400 1727204535.70683: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204535.70687: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204535.70692: getting variables 46400 1727204535.70693: in VariableManager get_vars() 46400 1727204535.70728: Calling all_inventory to load vars for managed-node2 46400 1727204535.70731: Calling groups_inventory to load vars for managed-node2 46400 1727204535.70735: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.70748: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.70752: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.70755: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.73620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.75750: done with get_vars() 46400 1727204535.75786: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.101) 0:00:26.043 ***** 46400 1727204535.75897: entering _queue_task() for managed-node2/include_tasks 46400 1727204535.76258: worker is 1 (out of 1 available) 46400 1727204535.76276: exiting _queue_task() for managed-node2/include_tasks 46400 1727204535.76294: done queuing things up, now waiting for results queue to drain 46400 1727204535.76296: waiting for pending results... 46400 1727204535.76608: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204535.76988: in run() - task 0affcd87-79f5-1303-fda8-0000000008ad 46400 1727204535.77008: variable 'ansible_search_path' from source: unknown 46400 1727204535.77017: variable 'ansible_search_path' from source: unknown 46400 1727204535.77071: calling self._execute() 46400 1727204535.77165: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.77185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.77200: variable 'omit' from source: magic vars 46400 1727204535.77608: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.77630: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.77642: _execute() done 46400 1727204535.77650: dumping result to json 46400 1727204535.77658: done dumping result, returning 46400 1727204535.77674: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-0000000008ad] 46400 1727204535.77686: sending task result for task 0affcd87-79f5-1303-fda8-0000000008ad 46400 1727204535.77829: no more pending results, returning what we have 46400 1727204535.77835: in VariableManager get_vars() 46400 1727204535.77888: Calling all_inventory to load vars for managed-node2 46400 1727204535.77891: Calling groups_inventory to load vars for managed-node2 46400 1727204535.77896: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.77912: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.77915: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.77919: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.79006: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008ad 46400 1727204535.79011: WORKER PROCESS EXITING 46400 1727204535.79910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204535.82042: done with get_vars() 46400 1727204535.82069: variable 'ansible_search_path' from source: unknown 46400 1727204535.82071: variable 'ansible_search_path' from source: unknown 46400 1727204535.82080: variable 'item' from source: include params 46400 1727204535.82197: variable 'item' from source: include params 46400 1727204535.82229: we have included files to process 46400 1727204535.82231: generating all_blocks data 46400 1727204535.82232: done generating all_blocks data 46400 1727204535.82236: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204535.82237: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204535.82239: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204535.83455: done processing included file 46400 1727204535.83457: iterating over new_blocks loaded from include file 46400 1727204535.83459: in VariableManager get_vars() 46400 1727204535.83483: done with get_vars() 46400 1727204535.83485: filtering new block on tags 46400 1727204535.83573: done filtering new block on tags 46400 1727204535.83577: in VariableManager get_vars() 46400 1727204535.83592: done with get_vars() 46400 1727204535.83594: filtering new block on tags 46400 1727204535.83656: done filtering new block on tags 46400 1727204535.83658: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204535.83668: extending task lists for all hosts with included blocks 46400 1727204535.84030: done extending task lists 46400 1727204535.84031: done processing included files 46400 1727204535.84032: results queue empty 46400 1727204535.84033: checking for any_errors_fatal 46400 1727204535.84036: done checking for any_errors_fatal 46400 1727204535.84037: checking for max_fail_percentage 46400 1727204535.84039: done checking for max_fail_percentage 46400 1727204535.84040: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.84041: done checking to see if all hosts have failed 46400 1727204535.84041: getting the remaining hosts for this loop 46400 1727204535.84043: done getting the remaining hosts for this loop 46400 1727204535.84050: getting the next task for host managed-node2 46400 1727204535.84055: done getting next task for host managed-node2 46400 1727204535.84057: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204535.84063: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204535.84067: getting variables 46400 1727204535.84068: in VariableManager get_vars() 46400 1727204535.84077: Calling all_inventory to load vars for managed-node2 46400 1727204535.84079: Calling groups_inventory to load vars for managed-node2 46400 1727204535.84082: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.84087: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.84090: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.84092: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.85830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.87879: done with get_vars() 46400 1727204535.87912: done getting variables 46400 1727204535.87959: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.121) 0:00:26.164 ***** 46400 1727204535.88002: entering _queue_task() for managed-node2/set_fact 46400 1727204535.88469: worker is 1 (out of 1 available) 46400 1727204535.88489: exiting _queue_task() for managed-node2/set_fact 46400 1727204535.88503: done queuing things up, now waiting for results queue to drain 46400 1727204535.88504: waiting for pending results... 
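
The set_fact task queued here produces the three flags shown in its ok: result further down, all initialized to false. A sketch of the task as implied by that result; the fact names and values are taken from the result, the rest is assumed:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
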
46400 1727204535.89459: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204535.89889: in run() - task 0affcd87-79f5-1303-fda8-000000000946 46400 1727204535.89894: variable 'ansible_search_path' from source: unknown 46400 1727204535.89897: variable 'ansible_search_path' from source: unknown 46400 1727204535.89931: calling self._execute() 46400 1727204535.90753: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.90758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.90768: variable 'omit' from source: magic vars 46400 1727204535.91140: variable 'ansible_distribution_major_version' from source: facts 46400 1727204535.91153: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204535.91163: variable 'omit' from source: magic vars 46400 1727204535.91630: variable 'omit' from source: magic vars 46400 1727204535.91668: variable 'omit' from source: magic vars 46400 1727204535.91745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204535.91785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204535.91809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204535.91828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.91839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204535.91872: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204535.91875: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.91878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.93079: Set connection var ansible_shell_type to sh 46400 1727204535.93094: Set connection var ansible_shell_executable to /bin/sh 46400 1727204535.93097: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204535.93104: Set connection var ansible_connection to ssh 46400 1727204535.93109: Set connection var ansible_pipelining to False 46400 1727204535.93114: Set connection var ansible_timeout to 10 46400 1727204535.93142: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.93146: variable 'ansible_connection' from source: unknown 46400 1727204535.93149: variable 'ansible_module_compression' from source: unknown 46400 1727204535.93151: variable 'ansible_shell_type' from source: unknown 46400 1727204535.93153: variable 'ansible_shell_executable' from source: unknown 46400 1727204535.93156: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204535.93158: variable 'ansible_pipelining' from source: unknown 46400 1727204535.93166: variable 'ansible_timeout' from source: unknown 46400 1727204535.93169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204535.93313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204535.93323: variable 
'omit' from source: magic vars 46400 1727204535.93330: starting attempt loop 46400 1727204535.93333: running the handler 46400 1727204535.93347: handler run complete 46400 1727204535.93357: attempt loop complete, returning result 46400 1727204535.93363: _execute() done 46400 1727204535.93368: dumping result to json 46400 1727204535.93371: done dumping result, returning 46400 1727204535.93373: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-000000000946] 46400 1727204535.93380: sending task result for task 0affcd87-79f5-1303-fda8-000000000946 46400 1727204535.93489: done sending task result for task 0affcd87-79f5-1303-fda8-000000000946 46400 1727204535.93492: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204535.93542: no more pending results, returning what we have 46400 1727204535.93546: results queue empty 46400 1727204535.93547: checking for any_errors_fatal 46400 1727204535.93549: done checking for any_errors_fatal 46400 1727204535.93549: checking for max_fail_percentage 46400 1727204535.93551: done checking for max_fail_percentage 46400 1727204535.93552: checking to see if all hosts have failed and the running result is not ok 46400 1727204535.93553: done checking to see if all hosts have failed 46400 1727204535.93553: getting the remaining hosts for this loop 46400 1727204535.93555: done getting the remaining hosts for this loop 46400 1727204535.93559: getting the next task for host managed-node2 46400 1727204535.93573: done getting next task for host managed-node2 46400 1727204535.93576: ^ task is: TASK: Stat profile file 46400 1727204535.93582: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204535.93586: getting variables 46400 1727204535.93587: in VariableManager get_vars() 46400 1727204535.93619: Calling all_inventory to load vars for managed-node2 46400 1727204535.93621: Calling groups_inventory to load vars for managed-node2 46400 1727204535.93624: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204535.93634: Calling all_plugins_play to load vars for managed-node2 46400 1727204535.93636: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204535.93639: Calling groups_plugins_play to load vars for managed-node2 46400 1727204535.96447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204535.98315: done with get_vars() 46400 1727204535.98916: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:02:15 -0400 (0:00:00.110) 0:00:26.274 ***** 46400 1727204535.99029: entering _queue_task() for managed-node2/stat 46400 1727204535.99390: worker is 1 (out of 1 available) 46400 1727204535.99403: exiting _queue_task() for managed-node2/stat 46400 1727204535.99415: done queuing things up, now waiting for results queue to drain 46400 1727204535.99641: waiting for pending results... 46400 1727204536.01031: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204536.01203: in run() - task 0affcd87-79f5-1303-fda8-000000000947 46400 1727204536.01223: variable 'ansible_search_path' from source: unknown 46400 1727204536.01231: variable 'ansible_search_path' from source: unknown 46400 1727204536.01279: calling self._execute() 46400 1727204536.01384: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.01402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.01416: variable 'omit' from source: magic vars 46400 1727204536.01835: variable 'ansible_distribution_major_version' from source: facts 46400 1727204536.01852: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204536.01867: variable 'omit' from source: magic vars 46400 1727204536.01929: variable 'omit' from source: magic vars 46400 1727204536.02044: variable 'profile' from source: play vars 46400 1727204536.02062: variable 'interface' from source: play vars 46400 1727204536.02132: variable 'interface' from source: play vars 46400 1727204536.02156: variable 'omit' from source: magic vars 46400 1727204536.02215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204536.02257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204536.02296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204536.02318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204536.02333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204536.02372: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204536.02387: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.02396: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.02508: Set connection var ansible_shell_type to sh 46400 1727204536.02524: Set connection var ansible_shell_executable to /bin/sh 46400 1727204536.02533: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204536.02542: Set connection var ansible_connection to ssh 46400 1727204536.02551: Set connection var ansible_pipelining to False 46400 1727204536.02563: Set connection var ansible_timeout to 10 46400 1727204536.02599: variable 'ansible_shell_executable' from source: unknown 46400 1727204536.02610: variable 'ansible_connection' from source: unknown 46400 1727204536.02617: variable 'ansible_module_compression' from source: unknown 46400 1727204536.02623: variable 'ansible_shell_type' from source: unknown 46400 1727204536.02629: variable 'ansible_shell_executable' from source: unknown 46400 1727204536.02635: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.02641: variable 'ansible_pipelining' from source: unknown 46400 1727204536.02653: variable 'ansible_timeout' from source: unknown 46400 1727204536.02692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.04458: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204536.04483: variable 'omit' from source: magic vars 46400 1727204536.04494: starting attempt loop 46400 1727204536.04501: running the handler 46400 1727204536.04520: _low_level_execute_command(): starting 46400 1727204536.04530: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204536.05842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.05857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.05879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.05903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.05946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.05959: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.05977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.06001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.06031: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.06046: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.06060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.06078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.06095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.06115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.06132: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.06148: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.06230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.06249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.06267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.06460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.08043: stdout chunk (state=3): >>>/root <<< 46400 1727204536.08229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.08232: stdout chunk (state=3): >>><<< 46400 1727204536.08234: stderr chunk (state=3): >>><<< 46400 1727204536.08344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.08347: _low_level_execute_command(): starting 46400 1727204536.08350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517 `" && echo ansible-tmp-1727204536.0825334-48605-136898739367517="` echo /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517 `" ) && sleep 0' 46400 1727204536.09296: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.09305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.09315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.09328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.09369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.09377: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.09388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.09429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.09437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.09444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.09452: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.09460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.09479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.09485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.09494: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.09500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.09574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.09592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.09603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.09679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.11513: stdout chunk (state=3): >>>ansible-tmp-1727204536.0825334-48605-136898739367517=/root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517 <<< 46400 1727204536.11703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.11707: stdout chunk (state=3): >>><<< 46400 1727204536.11714: stderr chunk (state=3): >>><<< 46400 1727204536.11733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204536.0825334-48605-136898739367517=/root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.11787: variable 'ansible_module_compression' from source: unknown 46400 1727204536.11849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204536.11891: variable 'ansible_facts' from source: unknown 46400 1727204536.11984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/AnsiballZ_stat.py 46400 1727204536.12125: Sending initial data 46400 1727204536.12129: Sent initial data (153 bytes) 46400 1727204536.13250: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.13257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.13277: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.13287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.13339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.13343: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.13380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.13383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.13405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.13409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.13421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.13436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.13446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.13466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.13471: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.13497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.13562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.13582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.13591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.13676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.15382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204536.15420: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204536.15466: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpbif1670i /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/AnsiballZ_stat.py <<< 46400 1727204536.15515: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204536.17107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.17228: stderr chunk (state=3): >>><<< 46400 1727204536.17231: stdout chunk (state=3): >>><<< 46400 1727204536.17234: done transferring module to remote 46400 1727204536.17236: _low_level_execute_command(): starting 46400 1727204536.17244: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/ /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/AnsiballZ_stat.py && sleep 0' 46400 1727204536.17806: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.17821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.17837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.17858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.17902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.17915: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.17929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.17947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.17961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.17978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.17991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.18005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.18020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.18033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.18046: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.18059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.18138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.18155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.18172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.18239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.20041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.20044: stdout chunk (state=3): >>><<< 46400 1727204536.20047: stderr chunk (state=3): >>><<< 46400 1727204536.20142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.20146: _low_level_execute_command(): starting 46400 1727204536.20149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/AnsiballZ_stat.py && sleep 0' 46400 1727204536.20814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.20829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.20848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.20880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.20923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.20940: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.20958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.20983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.20995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.21010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.21022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.21037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.21053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.21073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.21085: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.21100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.21187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.21211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.21226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.21423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.34487: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204536.35573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204536.35724: stderr chunk (state=3): >>><<< 46400 1727204536.35727: stdout chunk (state=3): >>><<< 46400 1727204536.35772: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204536.35885: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204536.35890: _low_level_execute_command(): starting 46400 1727204536.35892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204536.0825334-48605-136898739367517/ > /dev/null 2>&1 && sleep 0' 46400 1727204536.36785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.36855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.36884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.36906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.36967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.36984: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.36999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.37023: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.37041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.37053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.37074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.37091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.37111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.37124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.37136: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.37151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.37234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.37253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.37274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.37351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.39271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.39275: stdout chunk (state=3): >>><<< 46400 1727204536.39297: stderr chunk (state=3): >>><<< 46400 1727204536.39373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.39376: handler run complete 46400 1727204536.39379: attempt loop complete, returning result 46400 1727204536.39381: _execute() done 46400 1727204536.39384: dumping result to json 46400 1727204536.39386: done dumping result, returning 46400 1727204536.39388: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-000000000947] 46400 1727204536.39390: sending task result for task 0affcd87-79f5-1303-fda8-000000000947 46400 1727204536.39646: done sending task result for task 0affcd87-79f5-1303-fda8-000000000947 46400 1727204536.39650: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204536.39737: no more pending results, returning what we have 46400 
1727204536.39742: results queue empty 46400 1727204536.39743: checking for any_errors_fatal 46400 1727204536.39751: done checking for any_errors_fatal 46400 1727204536.39752: checking for max_fail_percentage 46400 1727204536.39754: done checking for max_fail_percentage 46400 1727204536.39755: checking to see if all hosts have failed and the running result is not ok 46400 1727204536.39756: done checking to see if all hosts have failed 46400 1727204536.39757: getting the remaining hosts for this loop 46400 1727204536.39758: done getting the remaining hosts for this loop 46400 1727204536.39763: getting the next task for host managed-node2 46400 1727204536.39775: done getting next task for host managed-node2 46400 1727204536.39777: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204536.39782: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204536.39787: getting variables 46400 1727204536.39789: in VariableManager get_vars() 46400 1727204536.39827: Calling all_inventory to load vars for managed-node2 46400 1727204536.39830: Calling groups_inventory to load vars for managed-node2 46400 1727204536.39834: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204536.39847: Calling all_plugins_play to load vars for managed-node2 46400 1727204536.39849: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204536.39852: Calling groups_plugins_play to load vars for managed-node2 46400 1727204536.41949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204536.44458: done with get_vars() 46400 1727204536.44487: done getting variables 46400 1727204536.44554: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:02:16 -0400 (0:00:00.455) 0:00:26.730 ***** 46400 1727204536.44596: entering _queue_task() for managed-node2/set_fact 46400 1727204536.44949: worker is 1 (out of 1 available) 46400 1727204536.44963: exiting _queue_task() for managed-node2/set_fact 46400 1727204536.44980: done queuing things up, now waiting for results queue to drain 46400 1727204536.44982: waiting for pending results... 
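For reference, the "Stat profile file" result above reduces to a bare existence check on the initscripts profile path; a minimal Python sketch of the equivalent check (the path and the disabled options are taken from the module invocation logged above, the function name is illustrative):

import os

def stat_profile(path="/etc/sysconfig/network-scripts/ifcfg-statebr"):
    # Mirrors the stat call above: get_attributes, get_checksum and get_mime
    # were all False, so only existence is reported back to the controller.
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

print(stat_profile())  # on managed-node2 the stat above reported exists=False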
46400 1727204536.45284: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204536.45446: in run() - task 0affcd87-79f5-1303-fda8-000000000948 46400 1727204536.45468: variable 'ansible_search_path' from source: unknown 46400 1727204536.45476: variable 'ansible_search_path' from source: unknown 46400 1727204536.45522: calling self._execute() 46400 1727204536.45626: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.45642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.45661: variable 'omit' from source: magic vars 46400 1727204536.46157: variable 'ansible_distribution_major_version' from source: facts 46400 1727204536.46177: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204536.46317: variable 'profile_stat' from source: set_fact 46400 1727204536.46332: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204536.46339: when evaluation is False, skipping this task 46400 1727204536.46346: _execute() done 46400 1727204536.46353: dumping result to json 46400 1727204536.46359: done dumping result, returning 46400 1727204536.46370: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-000000000948] 46400 1727204536.46382: sending task result for task 0affcd87-79f5-1303-fda8-000000000948 46400 1727204536.46494: done sending task result for task 0affcd87-79f5-1303-fda8-000000000948 46400 1727204536.46502: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204536.46571: no more pending results, returning what we have 46400 1727204536.46575: results queue empty 46400 1727204536.46576: checking for any_errors_fatal 46400 1727204536.46587: done checking for any_errors_fatal 46400 1727204536.46588: checking for max_fail_percentage 46400 1727204536.46589: done checking for max_fail_percentage 46400 1727204536.46590: checking to see if all hosts have failed and the running result is not ok 46400 1727204536.46591: done checking to see if all hosts have failed 46400 1727204536.46591: getting the remaining hosts for this loop 46400 1727204536.46593: done getting the remaining hosts for this loop 46400 1727204536.46597: getting the next task for host managed-node2 46400 1727204536.46606: done getting next task for host managed-node2 46400 1727204536.46608: ^ task is: TASK: Get NM profile info 46400 1727204536.46613: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204536.46616: getting variables 46400 1727204536.46618: in VariableManager get_vars() 46400 1727204536.46651: Calling all_inventory to load vars for managed-node2 46400 1727204536.46653: Calling groups_inventory to load vars for managed-node2 46400 1727204536.46657: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204536.46672: Calling all_plugins_play to load vars for managed-node2 46400 1727204536.46674: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204536.46677: Calling groups_plugins_play to load vars for managed-node2 46400 1727204536.49809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204536.51602: done with get_vars() 46400 1727204536.51642: done getting variables 46400 1727204536.51715: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:02:16 -0400 (0:00:00.071) 0:00:26.802 ***** 46400 1727204536.51766: entering _queue_task() for managed-node2/shell 46400 1727204536.52150: worker is 1 (out of 1 available) 46400 1727204536.52172: exiting _queue_task() for managed-node2/shell 46400 1727204536.52185: done queuing things up, now waiting for results queue to drain 46400 1727204536.52187: waiting for pending results... 
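The skip recorded above follows directly from that registered stat result; roughly, the conditional gate behaves like the simplified sketch below (not the executor's actual code, just the shape of the decision):

# profile_stat was registered by the "Stat profile file" task above
profile_stat = {"stat": {"exists": False}}

if profile_stat["stat"]["exists"]:
    pass  # the set_fact action would only run in this branch
else:
    result = {
        "changed": False,
        "false_condition": "profile_stat.stat.exists",
        "skip_reason": "Conditional result was False",
    }
    print(result)  # matches the skipping: result logged above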
46400 1727204536.52478: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204536.52633: in run() - task 0affcd87-79f5-1303-fda8-000000000949 46400 1727204536.52651: variable 'ansible_search_path' from source: unknown 46400 1727204536.52658: variable 'ansible_search_path' from source: unknown 46400 1727204536.52700: calling self._execute() 46400 1727204536.52803: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.52823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.52848: variable 'omit' from source: magic vars 46400 1727204536.53305: variable 'ansible_distribution_major_version' from source: facts 46400 1727204536.53324: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204536.53336: variable 'omit' from source: magic vars 46400 1727204536.53413: variable 'omit' from source: magic vars 46400 1727204536.53524: variable 'profile' from source: play vars 46400 1727204536.53534: variable 'interface' from source: play vars 46400 1727204536.53614: variable 'interface' from source: play vars 46400 1727204536.53636: variable 'omit' from source: magic vars 46400 1727204536.53691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204536.53738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204536.53762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204536.53787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204536.53809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204536.53844: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204536.53852: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.53859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.53970: Set connection var ansible_shell_type to sh 46400 1727204536.53985: Set connection var ansible_shell_executable to /bin/sh 46400 1727204536.53994: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204536.54003: Set connection var ansible_connection to ssh 46400 1727204536.54016: Set connection var ansible_pipelining to False 46400 1727204536.54031: Set connection var ansible_timeout to 10 46400 1727204536.54062: variable 'ansible_shell_executable' from source: unknown 46400 1727204536.54072: variable 'ansible_connection' from source: unknown 46400 1727204536.54078: variable 'ansible_module_compression' from source: unknown 46400 1727204536.54085: variable 'ansible_shell_type' from source: unknown 46400 1727204536.54092: variable 'ansible_shell_executable' from source: unknown 46400 1727204536.54098: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204536.54104: variable 'ansible_pipelining' from source: unknown 46400 1727204536.54111: variable 'ansible_timeout' from source: unknown 46400 1727204536.54122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204536.54277: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204536.54293: variable 'omit' from source: magic vars 46400 1727204536.54301: starting attempt loop 46400 1727204536.54307: running the handler 46400 1727204536.54321: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204536.54353: _low_level_execute_command(): starting 46400 1727204536.54371: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204536.55166: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.55183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.55197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.55219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.55272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.55285: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.55298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.55319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.55331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.55341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.55358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.55375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.55390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.55402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.55412: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.55429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.55513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.55539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.55557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.55640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.57281: stdout chunk (state=3): >>>/root <<< 46400 1727204536.57481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.57484: stdout chunk (state=3): >>><<< 46400 1727204536.57486: stderr chunk (state=3): >>><<< 46400 1727204536.57571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.57586: _low_level_execute_command(): starting 46400 1727204536.57589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467 `" && echo ansible-tmp-1727204536.575076-48671-239777104505467="` echo /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467 `" ) && sleep 0' 46400 1727204536.58412: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.58421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.58431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.58446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.58488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.58495: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.58505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.58519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.58527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.58533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.58542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.58551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.58571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.58578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.58585: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.58593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.58675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.58699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.58719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.58796: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 46400 1727204536.60642: stdout chunk (state=3): >>>ansible-tmp-1727204536.575076-48671-239777104505467=/root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467 <<< 46400 1727204536.60822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.60826: stdout chunk (state=3): >>><<< 46400 1727204536.60833: stderr chunk (state=3): >>><<< 46400 1727204536.60855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204536.575076-48671-239777104505467=/root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.60891: variable 'ansible_module_compression' from source: unknown 46400 1727204536.60946: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204536.60989: variable 'ansible_facts' from source: unknown 46400 1727204536.61096: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/AnsiballZ_command.py 46400 1727204536.61505: Sending initial data 46400 1727204536.61509: Sent initial data (155 bytes) 46400 1727204536.63221: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.63230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.63241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.63254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.63297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.63304: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.63313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.63326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.63333: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.63340: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.63348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.63357: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.63376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.63383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.63389: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.63398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.63491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.63502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.63505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.63572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.65269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204536.65305: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204536.65345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpdquksgf2 /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/AnsiballZ_command.py <<< 46400 1727204536.65382: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204536.66785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.66970: stderr chunk (state=3): >>><<< 46400 1727204536.66974: stdout chunk (state=3): >>><<< 46400 1727204536.66976: done transferring module to remote 46400 1727204536.66978: _low_level_execute_command(): starting 46400 1727204536.66981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/ /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/AnsiballZ_command.py && sleep 0' 46400 1727204536.68698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.68785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.68801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.68819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.68982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.68995: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.69009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.69026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 46400 1727204536.69038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.69053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.69068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.69083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.69099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.69111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.69121: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.69134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.69214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.69284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.69300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.69500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.71189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.71273: stderr chunk (state=3): >>><<< 46400 1727204536.71277: stdout chunk (state=3): >>><<< 46400 1727204536.71377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.71381: _low_level_execute_command(): starting 46400 1727204536.71384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/AnsiballZ_command.py && sleep 0' 46400 1727204536.72859: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.73013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.73030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.73050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.73098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
<<< 46400 1727204536.73116: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.73131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.73150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.73163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.73178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.73191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.73206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.73227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.73241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.73254: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.73270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.73460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.73480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.73495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.73660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.88532: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:16.866561", "end": "2024-09-24 15:02:16.884351", "delta": "0:00:00.017790", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204536.89687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204536.89799: stderr chunk (state=3): >>><<< 46400 1727204536.89803: stdout chunk (state=3): >>><<< 46400 1727204536.89944: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:16.866561", "end": "2024-09-24 15:02:16.884351", "delta": "0:00:00.017790", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
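The JSON above is the return payload of the wrapped command module (AnsiballZ_command.py) that was copied to the remote temp directory and executed over the reused ControlMaster connection (the auto-mux / mux_client_request_session lines). From the module_args echoed back (_raw_params with _uses_shell: true) and the nm_profile_exists.rc conditional evaluated by the next task, the task behind this call is most likely a shell task along these lines; this is a sketch reconstructed from the log, with {{ profile }} already rendered to statebr in the command actually sent:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
      register: nm_profile_exists   # rc is checked by the following set_fact task
      ignore_errors: true           # assumption: grep exits non-zero when no profile matches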
46400 1727204536.89954: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204536.89957: _low_level_execute_command(): starting 46400 1727204536.89959: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204536.575076-48671-239777104505467/ > /dev/null 2>&1 && sleep 0' 46400 1727204536.90787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204536.90808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.90831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.90851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.90897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.90909: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204536.90931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.90951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204536.90963: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204536.90977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204536.90987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204536.90997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204536.91009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204536.91019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204536.91033: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204536.91053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204536.91127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204536.91151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204536.91173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204536.91245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204536.93043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204536.93183: stderr chunk (state=3): >>><<< 46400 1727204536.93186: stdout chunk (state=3): >>><<< 46400 1727204536.93220: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204536.93228: handler run complete 46400 1727204536.93262: Evaluated conditional (False): False 46400 1727204536.93278: attempt loop complete, returning result 46400 1727204536.93281: _execute() done 46400 1727204536.93283: dumping result to json 46400 1727204536.93285: done dumping result, returning 46400 1727204536.93295: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-000000000949] 46400 1727204536.93301: sending task result for task 0affcd87-79f5-1303-fda8-000000000949 46400 1727204536.93410: done sending task result for task 0affcd87-79f5-1303-fda8-000000000949 46400 1727204536.93412: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017790", "end": "2024-09-24 15:02:16.884351", "rc": 0, "start": "2024-09-24 15:02:16.866561" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 46400 1727204536.93506: no more pending results, returning what we have 46400 1727204536.93511: results queue empty 46400 1727204536.93514: checking for any_errors_fatal 46400 1727204536.93519: done checking for any_errors_fatal 46400 1727204536.93520: checking for max_fail_percentage 46400 1727204536.93525: done checking for max_fail_percentage 46400 1727204536.93526: checking to see if all hosts have failed and the running result is not ok 46400 1727204536.93527: done checking to see if all hosts have failed 46400 1727204536.93527: getting the remaining hosts for this loop 46400 1727204536.93529: done getting the remaining hosts for this loop 46400 1727204536.93533: getting the next task for host managed-node2 46400 1727204536.93547: done getting next task for host managed-node2 46400 1727204536.93551: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204536.93556: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204536.93560: getting variables 46400 1727204536.93562: in VariableManager get_vars() 46400 1727204536.93599: Calling all_inventory to load vars for managed-node2 46400 1727204536.93604: Calling groups_inventory to load vars for managed-node2 46400 1727204536.93609: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204536.93630: Calling all_plugins_play to load vars for managed-node2 46400 1727204536.93633: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204536.93639: Calling groups_plugins_play to load vars for managed-node2 46400 1727204536.97112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.01548: done with get_vars() 46400 1727204537.01770: done getting variables 46400 1727204537.01837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.501) 0:00:27.303 ***** 46400 1727204537.01885: entering _queue_task() for managed-node2/set_fact 46400 1727204537.02796: worker is 1 (out of 1 available) 46400 1727204537.02808: exiting _queue_task() for managed-node2/set_fact 46400 1727204537.02823: done queuing things up, now waiting for results queue to drain 46400 1727204537.02824: waiting for pending results... 
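The set_fact task queued here (get_profile_stat.yml:35) is what turns the nmcli result into the flags the later assert tasks consume. The log below shows it gated on nm_profile_exists.rc == 0 (the ansible_distribution_major_version != '6' check is evaluated for every task in this run and presumably comes from an enclosing condition rather than this task), and its result sets three facts to true. A rough reconstruction, not necessarily the file's exact wording:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0   # evaluated True below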
46400 1727204537.03209: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204537.03376: in run() - task 0affcd87-79f5-1303-fda8-00000000094a 46400 1727204537.03403: variable 'ansible_search_path' from source: unknown 46400 1727204537.03413: variable 'ansible_search_path' from source: unknown 46400 1727204537.03476: calling self._execute() 46400 1727204537.03579: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.03594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.03611: variable 'omit' from source: magic vars 46400 1727204537.04240: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.04285: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.04427: variable 'nm_profile_exists' from source: set_fact 46400 1727204537.04470: Evaluated conditional (nm_profile_exists.rc == 0): True 46400 1727204537.04482: variable 'omit' from source: magic vars 46400 1727204537.04541: variable 'omit' from source: magic vars 46400 1727204537.04586: variable 'omit' from source: magic vars 46400 1727204537.04650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.04716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.04776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.04806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.04841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.04933: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.04946: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.04976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.05139: Set connection var ansible_shell_type to sh 46400 1727204537.05177: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.05192: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.05201: Set connection var ansible_connection to ssh 46400 1727204537.05209: Set connection var ansible_pipelining to False 46400 1727204537.05225: Set connection var ansible_timeout to 10 46400 1727204537.05292: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.05300: variable 'ansible_connection' from source: unknown 46400 1727204537.05306: variable 'ansible_module_compression' from source: unknown 46400 1727204537.05316: variable 'ansible_shell_type' from source: unknown 46400 1727204537.05338: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.05347: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.05357: variable 'ansible_pipelining' from source: unknown 46400 1727204537.05414: variable 'ansible_timeout' from source: unknown 46400 1727204537.05424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.05686: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204537.05707: variable 'omit' from source: magic vars 46400 1727204537.05763: starting attempt loop 46400 1727204537.05773: running the handler 46400 1727204537.05791: handler run complete 46400 1727204537.05806: attempt loop complete, returning result 46400 1727204537.05813: _execute() done 46400 1727204537.05828: dumping result to json 46400 1727204537.05851: done dumping result, returning 46400 1727204537.05867: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-00000000094a] 46400 1727204537.05895: sending task result for task 0affcd87-79f5-1303-fda8-00000000094a ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 46400 1727204537.06149: no more pending results, returning what we have 46400 1727204537.06154: results queue empty 46400 1727204537.06155: checking for any_errors_fatal 46400 1727204537.06171: done checking for any_errors_fatal 46400 1727204537.06172: checking for max_fail_percentage 46400 1727204537.06174: done checking for max_fail_percentage 46400 1727204537.06175: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.06176: done checking to see if all hosts have failed 46400 1727204537.06177: getting the remaining hosts for this loop 46400 1727204537.06179: done getting the remaining hosts for this loop 46400 1727204537.06183: getting the next task for host managed-node2 46400 1727204537.06195: done getting next task for host managed-node2 46400 1727204537.06198: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204537.06203: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.06210: getting variables 46400 1727204537.06212: in VariableManager get_vars() 46400 1727204537.06251: Calling all_inventory to load vars for managed-node2 46400 1727204537.06254: Calling groups_inventory to load vars for managed-node2 46400 1727204537.06258: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.06289: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.06293: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.06296: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.07400: done sending task result for task 0affcd87-79f5-1303-fda8-00000000094a 46400 1727204537.07404: WORKER PROCESS EXITING 46400 1727204537.09344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.10757: done with get_vars() 46400 1727204537.10799: done getting variables 46400 1727204537.10865: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.10980: variable 'profile' from source: play vars 46400 1727204537.10986: variable 'interface' from source: play vars 46400 1727204537.11078: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.092) 0:00:27.395 ***** 46400 1727204537.11111: entering _queue_task() for managed-node2/command 46400 1727204537.11900: worker is 1 (out of 1 available) 46400 1727204537.11913: exiting _queue_task() for managed-node2/command 46400 1727204537.11926: done queuing things up, now waiting for results queue to drain 46400 1727204537.11928: waiting for pending results... 
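This command task (get_profile_stat.yml:49) and the three that follow it (:56, :62, :69) all skip with false_condition: profile_stat.stat.exists. That is expected here: the statebr profile exists only as a NetworkManager keyfile under /etc/NetworkManager/system-connections (see the nmcli STDOUT above), so the initscripts-style ifcfg-statebr file these tasks would inspect is absent. The pattern amounts to the sketch below; the stat registration and the grep target are assumptions about the earlier part of get_profile_stat.yml, and only the when: condition and the task name are taken from the log:

    # assumed earlier in get_profile_stat.yml: register the ifcfg file's stat
    - name: Get ifcfg file stat
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # illustrative path
      register: profile_stat

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # illustrative command
      when: profile_stat.stat.exists   # False for statebr, so the task is skipped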
46400 1727204537.12213: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204537.12370: in run() - task 0affcd87-79f5-1303-fda8-00000000094c 46400 1727204537.12395: variable 'ansible_search_path' from source: unknown 46400 1727204537.12403: variable 'ansible_search_path' from source: unknown 46400 1727204537.12447: calling self._execute() 46400 1727204537.12565: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.12579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.12598: variable 'omit' from source: magic vars 46400 1727204537.13013: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.13037: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.13186: variable 'profile_stat' from source: set_fact 46400 1727204537.13204: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204537.13211: when evaluation is False, skipping this task 46400 1727204537.13218: _execute() done 46400 1727204537.13226: dumping result to json 46400 1727204537.13233: done dumping result, returning 46400 1727204537.13249: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000094c] 46400 1727204537.13262: sending task result for task 0affcd87-79f5-1303-fda8-00000000094c skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204537.13441: no more pending results, returning what we have 46400 1727204537.13446: results queue empty 46400 1727204537.13447: checking for any_errors_fatal 46400 1727204537.13458: done checking for any_errors_fatal 46400 1727204537.13459: checking for max_fail_percentage 46400 1727204537.13465: done checking for max_fail_percentage 46400 1727204537.13466: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.13467: done checking to see if all hosts have failed 46400 1727204537.13468: getting the remaining hosts for this loop 46400 1727204537.13470: done getting the remaining hosts for this loop 46400 1727204537.13475: getting the next task for host managed-node2 46400 1727204537.13486: done getting next task for host managed-node2 46400 1727204537.13490: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204537.13495: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204537.13499: getting variables 46400 1727204537.13501: in VariableManager get_vars() 46400 1727204537.13541: Calling all_inventory to load vars for managed-node2 46400 1727204537.13545: Calling groups_inventory to load vars for managed-node2 46400 1727204537.13550: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.13571: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.13575: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.13578: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.14586: done sending task result for task 0affcd87-79f5-1303-fda8-00000000094c 46400 1727204537.14590: WORKER PROCESS EXITING 46400 1727204537.15525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.17657: done with get_vars() 46400 1727204537.17695: done getting variables 46400 1727204537.17758: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.17891: variable 'profile' from source: play vars 46400 1727204537.17895: variable 'interface' from source: play vars 46400 1727204537.17954: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.068) 0:00:27.464 ***** 46400 1727204537.17993: entering _queue_task() for managed-node2/set_fact 46400 1727204537.18351: worker is 1 (out of 1 available) 46400 1727204537.18369: exiting _queue_task() for managed-node2/set_fact 46400 1727204537.18387: done queuing things up, now waiting for results queue to drain 46400 1727204537.18389: waiting for pending results... 
46400 1727204537.18690: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204537.18847: in run() - task 0affcd87-79f5-1303-fda8-00000000094d 46400 1727204537.18868: variable 'ansible_search_path' from source: unknown 46400 1727204537.18876: variable 'ansible_search_path' from source: unknown 46400 1727204537.18916: calling self._execute() 46400 1727204537.19015: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.19027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.19040: variable 'omit' from source: magic vars 46400 1727204537.19410: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.19426: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.19560: variable 'profile_stat' from source: set_fact 46400 1727204537.19582: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204537.19593: when evaluation is False, skipping this task 46400 1727204537.19599: _execute() done 46400 1727204537.19606: dumping result to json 46400 1727204537.19612: done dumping result, returning 46400 1727204537.19621: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000094d] 46400 1727204537.19631: sending task result for task 0affcd87-79f5-1303-fda8-00000000094d skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204537.19795: no more pending results, returning what we have 46400 1727204537.19801: results queue empty 46400 1727204537.19802: checking for any_errors_fatal 46400 1727204537.19811: done checking for any_errors_fatal 46400 1727204537.19812: checking for max_fail_percentage 46400 1727204537.19815: done checking for max_fail_percentage 46400 1727204537.19815: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.19816: done checking to see if all hosts have failed 46400 1727204537.19817: getting the remaining hosts for this loop 46400 1727204537.19819: done getting the remaining hosts for this loop 46400 1727204537.19824: getting the next task for host managed-node2 46400 1727204537.19834: done getting next task for host managed-node2 46400 1727204537.19838: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204537.19843: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204537.19847: getting variables 46400 1727204537.19851: in VariableManager get_vars() 46400 1727204537.19892: Calling all_inventory to load vars for managed-node2 46400 1727204537.19895: Calling groups_inventory to load vars for managed-node2 46400 1727204537.19899: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.19916: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.19918: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.19921: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.21034: done sending task result for task 0affcd87-79f5-1303-fda8-00000000094d 46400 1727204537.21037: WORKER PROCESS EXITING 46400 1727204537.21907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.24284: done with get_vars() 46400 1727204537.24318: done getting variables 46400 1727204537.24389: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.24526: variable 'profile' from source: play vars 46400 1727204537.24531: variable 'interface' from source: play vars 46400 1727204537.24603: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.066) 0:00:27.530 ***** 46400 1727204537.24640: entering _queue_task() for managed-node2/command 46400 1727204537.25015: worker is 1 (out of 1 available) 46400 1727204537.25030: exiting _queue_task() for managed-node2/command 46400 1727204537.25043: done queuing things up, now waiting for results queue to drain 46400 1727204537.25045: waiting for pending results... 
46400 1727204537.25352: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204537.25512: in run() - task 0affcd87-79f5-1303-fda8-00000000094e 46400 1727204537.25534: variable 'ansible_search_path' from source: unknown 46400 1727204537.25542: variable 'ansible_search_path' from source: unknown 46400 1727204537.25587: calling self._execute() 46400 1727204537.25794: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.25806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.25820: variable 'omit' from source: magic vars 46400 1727204537.26314: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.26333: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.26534: variable 'profile_stat' from source: set_fact 46400 1727204537.26582: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204537.26590: when evaluation is False, skipping this task 46400 1727204537.26634: _execute() done 46400 1727204537.26643: dumping result to json 46400 1727204537.26650: done dumping result, returning 46400 1727204537.26661: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000094e] 46400 1727204537.26674: sending task result for task 0affcd87-79f5-1303-fda8-00000000094e 46400 1727204537.26834: done sending task result for task 0affcd87-79f5-1303-fda8-00000000094e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204537.26893: no more pending results, returning what we have 46400 1727204537.26899: results queue empty 46400 1727204537.26900: checking for any_errors_fatal 46400 1727204537.26906: done checking for any_errors_fatal 46400 1727204537.26906: checking for max_fail_percentage 46400 1727204537.26908: done checking for max_fail_percentage 46400 1727204537.26909: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.26910: done checking to see if all hosts have failed 46400 1727204537.26910: getting the remaining hosts for this loop 46400 1727204537.26912: done getting the remaining hosts for this loop 46400 1727204537.26916: getting the next task for host managed-node2 46400 1727204537.26924: done getting next task for host managed-node2 46400 1727204537.26927: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204537.26932: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204537.26937: getting variables 46400 1727204537.26939: in VariableManager get_vars() 46400 1727204537.26976: Calling all_inventory to load vars for managed-node2 46400 1727204537.26979: Calling groups_inventory to load vars for managed-node2 46400 1727204537.26983: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.26997: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.27000: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.27003: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.28026: WORKER PROCESS EXITING 46400 1727204537.34475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.36188: done with get_vars() 46400 1727204537.36226: done getting variables 46400 1727204537.36285: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.36395: variable 'profile' from source: play vars 46400 1727204537.36398: variable 'interface' from source: play vars 46400 1727204537.36457: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.118) 0:00:27.649 ***** 46400 1727204537.36488: entering _queue_task() for managed-node2/set_fact 46400 1727204537.36836: worker is 1 (out of 1 available) 46400 1727204537.36853: exiting _queue_task() for managed-node2/set_fact 46400 1727204537.36867: done queuing things up, now waiting for results queue to drain 46400 1727204537.36869: waiting for pending results... 
46400 1727204537.37197: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204537.37371: in run() - task 0affcd87-79f5-1303-fda8-00000000094f 46400 1727204537.37394: variable 'ansible_search_path' from source: unknown 46400 1727204537.37404: variable 'ansible_search_path' from source: unknown 46400 1727204537.37454: calling self._execute() 46400 1727204537.37561: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.37576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.37591: variable 'omit' from source: magic vars 46400 1727204537.37996: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.38014: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.38143: variable 'profile_stat' from source: set_fact 46400 1727204537.38159: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204537.38169: when evaluation is False, skipping this task 46400 1727204537.38176: _execute() done 46400 1727204537.38185: dumping result to json 46400 1727204537.38200: done dumping result, returning 46400 1727204537.38210: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000094f] 46400 1727204537.38221: sending task result for task 0affcd87-79f5-1303-fda8-00000000094f skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204537.38385: no more pending results, returning what we have 46400 1727204537.38389: results queue empty 46400 1727204537.38391: checking for any_errors_fatal 46400 1727204537.38403: done checking for any_errors_fatal 46400 1727204537.38403: checking for max_fail_percentage 46400 1727204537.38406: done checking for max_fail_percentage 46400 1727204537.38406: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.38407: done checking to see if all hosts have failed 46400 1727204537.38408: getting the remaining hosts for this loop 46400 1727204537.38410: done getting the remaining hosts for this loop 46400 1727204537.38414: getting the next task for host managed-node2 46400 1727204537.38426: done getting next task for host managed-node2 46400 1727204537.38429: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 46400 1727204537.38433: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.38441: getting variables 46400 1727204537.38443: in VariableManager get_vars() 46400 1727204537.38484: Calling all_inventory to load vars for managed-node2 46400 1727204537.38487: Calling groups_inventory to load vars for managed-node2 46400 1727204537.38491: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.38505: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.38508: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.38511: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.39513: done sending task result for task 0affcd87-79f5-1303-fda8-00000000094f 46400 1727204537.39517: WORKER PROCESS EXITING 46400 1727204537.40315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.42042: done with get_vars() 46400 1727204537.42075: done getting variables 46400 1727204537.42144: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.42288: variable 'profile' from source: play vars 46400 1727204537.42292: variable 'interface' from source: play vars 46400 1727204537.42359: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.059) 0:00:27.708 ***** 46400 1727204537.42396: entering _queue_task() for managed-node2/assert 46400 1727204537.42749: worker is 1 (out of 1 available) 46400 1727204537.42768: exiting _queue_task() for managed-node2/assert 46400 1727204537.42786: done queuing things up, now waiting for results queue to drain 46400 1727204537.42788: waiting for pending results... 
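The three assert tasks from assert_profile_present.yml (lines 5, 10 and 15) close the loop on the flags set by the earlier set_fact: each checks a single lsr_net_profile_* fact and reports "All assertions passed". The first two conditionals (lsr_net_profile_exists, lsr_net_profile_ansible_managed) are visible in the log below; the third is presumed to be lsr_net_profile_fingerprint by symmetry, since this section ends before its handler runs. A sketch:

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint   # assumed; the run is cut off before this evaluation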
46400 1727204537.43101: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 46400 1727204537.43257: in run() - task 0affcd87-79f5-1303-fda8-0000000008ae 46400 1727204537.43280: variable 'ansible_search_path' from source: unknown 46400 1727204537.43287: variable 'ansible_search_path' from source: unknown 46400 1727204537.43329: calling self._execute() 46400 1727204537.43433: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.43452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.43470: variable 'omit' from source: magic vars 46400 1727204537.43860: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.43883: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.43896: variable 'omit' from source: magic vars 46400 1727204537.43945: variable 'omit' from source: magic vars 46400 1727204537.44055: variable 'profile' from source: play vars 46400 1727204537.44069: variable 'interface' from source: play vars 46400 1727204537.44142: variable 'interface' from source: play vars 46400 1727204537.44177: variable 'omit' from source: magic vars 46400 1727204537.45070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.45114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.45150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.45255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.45274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.45310: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.45348: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.45357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.45586: Set connection var ansible_shell_type to sh 46400 1727204537.45683: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.45694: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.45704: Set connection var ansible_connection to ssh 46400 1727204537.45713: Set connection var ansible_pipelining to False 46400 1727204537.45723: Set connection var ansible_timeout to 10 46400 1727204537.45755: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.45765: variable 'ansible_connection' from source: unknown 46400 1727204537.45890: variable 'ansible_module_compression' from source: unknown 46400 1727204537.45898: variable 'ansible_shell_type' from source: unknown 46400 1727204537.45905: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.45912: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.45919: variable 'ansible_pipelining' from source: unknown 46400 1727204537.45926: variable 'ansible_timeout' from source: unknown 46400 1727204537.45934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.46086: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204537.46227: variable 'omit' from source: magic vars 46400 1727204537.46238: starting attempt loop 46400 1727204537.46246: running the handler 46400 1727204537.46491: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204537.46546: Evaluated conditional (lsr_net_profile_exists): True 46400 1727204537.46557: handler run complete 46400 1727204537.46580: attempt loop complete, returning result 46400 1727204537.46651: _execute() done 46400 1727204537.46658: dumping result to json 46400 1727204537.46666: done dumping result, returning 46400 1727204537.46676: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [0affcd87-79f5-1303-fda8-0000000008ae] 46400 1727204537.46685: sending task result for task 0affcd87-79f5-1303-fda8-0000000008ae ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204537.46843: no more pending results, returning what we have 46400 1727204537.46848: results queue empty 46400 1727204537.46850: checking for any_errors_fatal 46400 1727204537.46857: done checking for any_errors_fatal 46400 1727204537.46858: checking for max_fail_percentage 46400 1727204537.46860: done checking for max_fail_percentage 46400 1727204537.46861: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.46862: done checking to see if all hosts have failed 46400 1727204537.46862: getting the remaining hosts for this loop 46400 1727204537.46870: done getting the remaining hosts for this loop 46400 1727204537.46875: getting the next task for host managed-node2 46400 1727204537.46885: done getting next task for host managed-node2 46400 1727204537.46889: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 46400 1727204537.46893: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.46897: getting variables 46400 1727204537.46899: in VariableManager get_vars() 46400 1727204537.46937: Calling all_inventory to load vars for managed-node2 46400 1727204537.46941: Calling groups_inventory to load vars for managed-node2 46400 1727204537.46945: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.46959: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.46962: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.46967: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.48273: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008ae 46400 1727204537.48276: WORKER PROCESS EXITING 46400 1727204537.50188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.53802: done with get_vars() 46400 1727204537.53833: done getting variables 46400 1727204537.53893: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.54243: variable 'profile' from source: play vars 46400 1727204537.54247: variable 'interface' from source: play vars 46400 1727204537.54311: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.120) 0:00:27.829 ***** 46400 1727204537.54463: entering _queue_task() for managed-node2/assert 46400 1727204537.55173: worker is 1 (out of 1 available) 46400 1727204537.55186: exiting _queue_task() for managed-node2/assert 46400 1727204537.55199: done queuing things up, now waiting for results queue to drain 46400 1727204537.55201: waiting for pending results... 
46400 1727204537.56218: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 46400 1727204537.56496: in run() - task 0affcd87-79f5-1303-fda8-0000000008af 46400 1727204537.56510: variable 'ansible_search_path' from source: unknown 46400 1727204537.56513: variable 'ansible_search_path' from source: unknown 46400 1727204537.56636: calling self._execute() 46400 1727204537.56894: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.56899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.56909: variable 'omit' from source: magic vars 46400 1727204537.57551: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.57567: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.57574: variable 'omit' from source: magic vars 46400 1727204537.57624: variable 'omit' from source: magic vars 46400 1727204537.57731: variable 'profile' from source: play vars 46400 1727204537.57737: variable 'interface' from source: play vars 46400 1727204537.57806: variable 'interface' from source: play vars 46400 1727204537.57823: variable 'omit' from source: magic vars 46400 1727204537.57874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.57910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.57931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.57949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.57971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.58001: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.58004: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.58007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.58147: Set connection var ansible_shell_type to sh 46400 1727204537.58158: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.58169: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.58174: Set connection var ansible_connection to ssh 46400 1727204537.58184: Set connection var ansible_pipelining to False 46400 1727204537.58190: Set connection var ansible_timeout to 10 46400 1727204537.58215: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.58219: variable 'ansible_connection' from source: unknown 46400 1727204537.58221: variable 'ansible_module_compression' from source: unknown 46400 1727204537.58223: variable 'ansible_shell_type' from source: unknown 46400 1727204537.58226: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.58228: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.58230: variable 'ansible_pipelining' from source: unknown 46400 1727204537.58233: variable 'ansible_timeout' from source: unknown 46400 1727204537.58237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.58378: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204537.58389: variable 'omit' from source: magic vars 46400 1727204537.58394: starting attempt loop 46400 1727204537.58402: running the handler 46400 1727204537.58516: variable 'lsr_net_profile_ansible_managed' from source: set_fact 46400 1727204537.58519: Evaluated conditional (lsr_net_profile_ansible_managed): True 46400 1727204537.58527: handler run complete 46400 1727204537.58540: attempt loop complete, returning result 46400 1727204537.58543: _execute() done 46400 1727204537.58546: dumping result to json 46400 1727204537.58549: done dumping result, returning 46400 1727204537.58554: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcd87-79f5-1303-fda8-0000000008af] 46400 1727204537.58560: sending task result for task 0affcd87-79f5-1303-fda8-0000000008af 46400 1727204537.58652: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008af 46400 1727204537.58654: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204537.58701: no more pending results, returning what we have 46400 1727204537.58705: results queue empty 46400 1727204537.58706: checking for any_errors_fatal 46400 1727204537.58714: done checking for any_errors_fatal 46400 1727204537.58715: checking for max_fail_percentage 46400 1727204537.58717: done checking for max_fail_percentage 46400 1727204537.58717: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.58718: done checking to see if all hosts have failed 46400 1727204537.58719: getting the remaining hosts for this loop 46400 1727204537.58721: done getting the remaining hosts for this loop 46400 1727204537.58725: getting the next task for host managed-node2 46400 1727204537.58733: done getting next task for host managed-node2 46400 1727204537.58735: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 46400 1727204537.58738: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.58742: getting variables 46400 1727204537.58744: in VariableManager get_vars() 46400 1727204537.58783: Calling all_inventory to load vars for managed-node2 46400 1727204537.58786: Calling groups_inventory to load vars for managed-node2 46400 1727204537.58790: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.58801: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.58804: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.58807: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.60812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.63055: done with get_vars() 46400 1727204537.63084: done getting variables 46400 1727204537.63186: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.63427: variable 'profile' from source: play vars 46400 1727204537.63431: variable 'interface' from source: play vars 46400 1727204537.63614: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.091) 0:00:27.921 ***** 46400 1727204537.63651: entering _queue_task() for managed-node2/assert 46400 1727204537.64333: worker is 1 (out of 1 available) 46400 1727204537.64346: exiting _queue_task() for managed-node2/assert 46400 1727204537.64356: done queuing things up, now waiting for results queue to drain 46400 1727204537.64358: waiting for pending results... 
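The two assert records around this point (the ansible-managed check above and the fingerprint check that starts below) both come from tasks/assert_profile_present.yml and only evaluate facts set earlier in the test; nothing is executed on the managed node. A minimal sketch of what those tasks presumably look like, reconstructed from the task names and conditionals logged here rather than copied from the file:

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint

In this run profile resolves to statebr and both facts are true, so the assert action reports "All assertions passed" with changed: false.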
46400 1727204537.64646: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 46400 1727204537.64787: in run() - task 0affcd87-79f5-1303-fda8-0000000008b0 46400 1727204537.64809: variable 'ansible_search_path' from source: unknown 46400 1727204537.64816: variable 'ansible_search_path' from source: unknown 46400 1727204537.64856: calling self._execute() 46400 1727204537.64960: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.64976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.64994: variable 'omit' from source: magic vars 46400 1727204537.65368: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.65386: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.65396: variable 'omit' from source: magic vars 46400 1727204537.65459: variable 'omit' from source: magic vars 46400 1727204537.65573: variable 'profile' from source: play vars 46400 1727204537.65584: variable 'interface' from source: play vars 46400 1727204537.65654: variable 'interface' from source: play vars 46400 1727204537.65683: variable 'omit' from source: magic vars 46400 1727204537.65730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.65779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.65805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.65828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.65845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.65889: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.65898: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.65906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.66012: Set connection var ansible_shell_type to sh 46400 1727204537.66028: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.66040: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.66054: Set connection var ansible_connection to ssh 46400 1727204537.66083: Set connection var ansible_pipelining to False 46400 1727204537.66097: Set connection var ansible_timeout to 10 46400 1727204537.66127: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.66136: variable 'ansible_connection' from source: unknown 46400 1727204537.66144: variable 'ansible_module_compression' from source: unknown 46400 1727204537.66274: variable 'ansible_shell_type' from source: unknown 46400 1727204537.66282: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.66289: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.66296: variable 'ansible_pipelining' from source: unknown 46400 1727204537.66303: variable 'ansible_timeout' from source: unknown 46400 1727204537.66311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.66459: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204537.66482: variable 'omit' from source: magic vars 46400 1727204537.66494: starting attempt loop 46400 1727204537.66499: running the handler 46400 1727204537.66624: variable 'lsr_net_profile_fingerprint' from source: set_fact 46400 1727204537.66634: Evaluated conditional (lsr_net_profile_fingerprint): True 46400 1727204537.66644: handler run complete 46400 1727204537.66665: attempt loop complete, returning result 46400 1727204537.66673: _execute() done 46400 1727204537.66680: dumping result to json 46400 1727204537.66688: done dumping result, returning 46400 1727204537.66704: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [0affcd87-79f5-1303-fda8-0000000008b0] 46400 1727204537.66716: sending task result for task 0affcd87-79f5-1303-fda8-0000000008b0 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204537.66868: no more pending results, returning what we have 46400 1727204537.66873: results queue empty 46400 1727204537.66875: checking for any_errors_fatal 46400 1727204537.66884: done checking for any_errors_fatal 46400 1727204537.66885: checking for max_fail_percentage 46400 1727204537.66887: done checking for max_fail_percentage 46400 1727204537.66887: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.66888: done checking to see if all hosts have failed 46400 1727204537.66889: getting the remaining hosts for this loop 46400 1727204537.66891: done getting the remaining hosts for this loop 46400 1727204537.66895: getting the next task for host managed-node2 46400 1727204537.66908: done getting next task for host managed-node2 46400 1727204537.66913: ^ task is: TASK: Conditional asserts 46400 1727204537.66915: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.66923: getting variables 46400 1727204537.66925: in VariableManager get_vars() 46400 1727204537.66962: Calling all_inventory to load vars for managed-node2 46400 1727204537.66971: Calling groups_inventory to load vars for managed-node2 46400 1727204537.66976: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.66989: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.66992: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.66995: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.68032: done sending task result for task 0affcd87-79f5-1303-fda8-0000000008b0 46400 1727204537.68036: WORKER PROCESS EXITING 46400 1727204537.69293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.70994: done with get_vars() 46400 1727204537.71023: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.074) 0:00:27.995 ***** 46400 1727204537.71135: entering _queue_task() for managed-node2/include_tasks 46400 1727204537.71514: worker is 1 (out of 1 available) 46400 1727204537.71533: exiting _queue_task() for managed-node2/include_tasks 46400 1727204537.71546: done queuing things up, now waiting for results queue to drain 46400 1727204537.71547: waiting for pending results... 46400 1727204537.71852: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204537.71987: in run() - task 0affcd87-79f5-1303-fda8-0000000005ba 46400 1727204537.72009: variable 'ansible_search_path' from source: unknown 46400 1727204537.72017: variable 'ansible_search_path' from source: unknown 46400 1727204537.72322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204537.74745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204537.74826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204537.74877: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204537.74916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204537.74949: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204537.75418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204537.75453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204537.75489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204537.75534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204537.75558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204537.75726: dumping result to json 46400 1727204537.75734: done dumping result, returning 46400 1727204537.75745: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-0000000005ba] 46400 1727204537.75755: sending task result for task 0affcd87-79f5-1303-fda8-0000000005ba skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 46400 1727204537.75926: no more pending results, returning what we have 46400 1727204537.75930: results queue empty 46400 1727204537.75931: checking for any_errors_fatal 46400 1727204537.75937: done checking for any_errors_fatal 46400 1727204537.75937: checking for max_fail_percentage 46400 1727204537.75939: done checking for max_fail_percentage 46400 1727204537.75940: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.75941: done checking to see if all hosts have failed 46400 1727204537.75941: getting the remaining hosts for this loop 46400 1727204537.75943: done getting the remaining hosts for this loop 46400 1727204537.75947: getting the next task for host managed-node2 46400 1727204537.75956: done getting next task for host managed-node2 46400 1727204537.75958: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204537.75961: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.75966: getting variables 46400 1727204537.75968: in VariableManager get_vars() 46400 1727204537.76002: Calling all_inventory to load vars for managed-node2 46400 1727204537.76005: Calling groups_inventory to load vars for managed-node2 46400 1727204537.76008: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.76021: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.76023: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.76026: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.76588: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005ba 46400 1727204537.76597: WORKER PROCESS EXITING 46400 1727204537.78069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.79317: done with get_vars() 46400 1727204537.79337: done getting variables 46400 1727204537.79388: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204537.79485: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.083) 0:00:28.079 ***** 46400 1727204537.79511: entering _queue_task() for managed-node2/debug 46400 1727204537.79757: worker is 1 (out of 1 available) 46400 1727204537.79902: exiting _queue_task() for managed-node2/debug 46400 1727204537.79926: done queuing things up, now waiting for results queue to drain 46400 1727204537.79928: waiting for pending results... 
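The "Conditional asserts" record above is an include_tasks step that loops over a per-test list of extra assertion files; the list is empty for this scenario, which is why it is skipped with "No items in the list". It is immediately followed by the debug task whose banner appears below. A rough sketch of that part of run_test.yml, assuming a loop variable name (lsr_assert_when) that does not appear in this log, while the debug message format matches the output exactly:

- name: Conditional asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert_when | default([]) }}"   # assumed name; empty in this run

- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"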
46400 1727204537.80101: running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile without autoconnect' 46400 1727204537.80223: in run() - task 0affcd87-79f5-1303-fda8-0000000005bb 46400 1727204537.80243: variable 'ansible_search_path' from source: unknown 46400 1727204537.80249: variable 'ansible_search_path' from source: unknown 46400 1727204537.80294: calling self._execute() 46400 1727204537.80404: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.80427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.80523: variable 'omit' from source: magic vars 46400 1727204537.81024: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.81058: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.81079: variable 'omit' from source: magic vars 46400 1727204537.81173: variable 'omit' from source: magic vars 46400 1727204537.81301: variable 'lsr_description' from source: include params 46400 1727204537.81324: variable 'omit' from source: magic vars 46400 1727204537.81377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.81420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.81447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.81473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.81495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.81530: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.81538: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.81545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.81651: Set connection var ansible_shell_type to sh 46400 1727204537.81672: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.81685: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.81696: Set connection var ansible_connection to ssh 46400 1727204537.81712: Set connection var ansible_pipelining to False 46400 1727204537.81724: Set connection var ansible_timeout to 10 46400 1727204537.81754: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.81770: variable 'ansible_connection' from source: unknown 46400 1727204537.81779: variable 'ansible_module_compression' from source: unknown 46400 1727204537.81787: variable 'ansible_shell_type' from source: unknown 46400 1727204537.81794: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.81800: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.81808: variable 'ansible_pipelining' from source: unknown 46400 1727204537.81818: variable 'ansible_timeout' from source: unknown 46400 1727204537.81825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.81980: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204537.81990: variable 'omit' from source: magic vars 46400 1727204537.81995: starting attempt loop 46400 1727204537.81998: running the handler 46400 1727204537.82040: handler run complete 46400 1727204537.82052: attempt loop complete, returning result 46400 1727204537.82055: _execute() done 46400 1727204537.82058: dumping result to json 46400 1727204537.82060: done dumping result, returning 46400 1727204537.82069: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile without autoconnect' [0affcd87-79f5-1303-fda8-0000000005bb] 46400 1727204537.82074: sending task result for task 0affcd87-79f5-1303-fda8-0000000005bb 46400 1727204537.82173: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005bb 46400 1727204537.82176: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 46400 1727204537.82218: no more pending results, returning what we have 46400 1727204537.82224: results queue empty 46400 1727204537.82225: checking for any_errors_fatal 46400 1727204537.82236: done checking for any_errors_fatal 46400 1727204537.82236: checking for max_fail_percentage 46400 1727204537.82238: done checking for max_fail_percentage 46400 1727204537.82239: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.82240: done checking to see if all hosts have failed 46400 1727204537.82240: getting the remaining hosts for this loop 46400 1727204537.82242: done getting the remaining hosts for this loop 46400 1727204537.82246: getting the next task for host managed-node2 46400 1727204537.82255: done getting next task for host managed-node2 46400 1727204537.82258: ^ task is: TASK: Cleanup 46400 1727204537.82260: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204537.82272: getting variables 46400 1727204537.82274: in VariableManager get_vars() 46400 1727204537.82309: Calling all_inventory to load vars for managed-node2 46400 1727204537.82311: Calling groups_inventory to load vars for managed-node2 46400 1727204537.82314: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.82325: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.82327: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.82329: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.83146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.84068: done with get_vars() 46400 1727204537.84086: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.046) 0:00:28.126 ***** 46400 1727204537.84157: entering _queue_task() for managed-node2/include_tasks 46400 1727204537.84395: worker is 1 (out of 1 available) 46400 1727204537.84410: exiting _queue_task() for managed-node2/include_tasks 46400 1727204537.84423: done queuing things up, now waiting for results queue to drain 46400 1727204537.84425: waiting for pending results... 46400 1727204537.84615: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204537.84692: in run() - task 0affcd87-79f5-1303-fda8-0000000005bf 46400 1727204537.84703: variable 'ansible_search_path' from source: unknown 46400 1727204537.84707: variable 'ansible_search_path' from source: unknown 46400 1727204537.84740: variable 'lsr_cleanup' from source: include params 46400 1727204537.84901: variable 'lsr_cleanup' from source: include params 46400 1727204537.84956: variable 'omit' from source: magic vars 46400 1727204537.85061: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.85071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.85080: variable 'omit' from source: magic vars 46400 1727204537.85252: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.85260: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.85278: variable 'item' from source: unknown 46400 1727204537.85325: variable 'item' from source: unknown 46400 1727204537.85350: variable 'item' from source: unknown 46400 1727204537.85398: variable 'item' from source: unknown 46400 1727204537.85512: dumping result to json 46400 1727204537.85514: done dumping result, returning 46400 1727204537.85518: done running TaskExecutor() for managed-node2/TASK: Cleanup [0affcd87-79f5-1303-fda8-0000000005bf] 46400 1727204537.85520: sending task result for task 0affcd87-79f5-1303-fda8-0000000005bf 46400 1727204537.85554: done sending task result for task 0affcd87-79f5-1303-fda8-0000000005bf 46400 1727204537.85557: WORKER PROCESS EXITING 46400 1727204537.85582: no more pending results, returning what we have 46400 1727204537.85588: in VariableManager get_vars() 46400 1727204537.85627: Calling all_inventory to load vars for managed-node2 46400 1727204537.85630: Calling groups_inventory to load vars for managed-node2 46400 1727204537.85633: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.85646: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204537.85649: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.85652: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.86578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.87478: done with get_vars() 46400 1727204537.87493: variable 'ansible_search_path' from source: unknown 46400 1727204537.87493: variable 'ansible_search_path' from source: unknown 46400 1727204537.87527: we have included files to process 46400 1727204537.87528: generating all_blocks data 46400 1727204537.87529: done generating all_blocks data 46400 1727204537.87534: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204537.87535: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204537.87536: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204537.87680: done processing included file 46400 1727204537.87682: iterating over new_blocks loaded from include file 46400 1727204537.87683: in VariableManager get_vars() 46400 1727204537.87694: done with get_vars() 46400 1727204537.87695: filtering new block on tags 46400 1727204537.87713: done filtering new block on tags 46400 1727204537.87714: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204537.87717: extending task lists for all hosts with included blocks 46400 1727204537.88536: done extending task lists 46400 1727204537.88537: done processing included files 46400 1727204537.88538: results queue empty 46400 1727204537.88538: checking for any_errors_fatal 46400 1727204537.88541: done checking for any_errors_fatal 46400 1727204537.88542: checking for max_fail_percentage 46400 1727204537.88542: done checking for max_fail_percentage 46400 1727204537.88543: checking to see if all hosts have failed and the running result is not ok 46400 1727204537.88543: done checking to see if all hosts have failed 46400 1727204537.88544: getting the remaining hosts for this loop 46400 1727204537.88545: done getting the remaining hosts for this loop 46400 1727204537.88546: getting the next task for host managed-node2 46400 1727204537.88550: done getting next task for host managed-node2 46400 1727204537.88551: ^ task is: TASK: Cleanup profile and device 46400 1727204537.88553: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204537.88555: getting variables 46400 1727204537.88556: in VariableManager get_vars() 46400 1727204537.88565: Calling all_inventory to load vars for managed-node2 46400 1727204537.88567: Calling groups_inventory to load vars for managed-node2 46400 1727204537.88568: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204537.88573: Calling all_plugins_play to load vars for managed-node2 46400 1727204537.88574: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204537.88576: Calling groups_plugins_play to load vars for managed-node2 46400 1727204537.89276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204537.90241: done with get_vars() 46400 1727204537.90258: done getting variables 46400 1727204537.90295: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:02:17 -0400 (0:00:00.061) 0:00:28.187 ***** 46400 1727204537.90318: entering _queue_task() for managed-node2/shell 46400 1727204537.90574: worker is 1 (out of 1 available) 46400 1727204537.90587: exiting _queue_task() for managed-node2/shell 46400 1727204537.90600: done queuing things up, now waiting for results queue to drain 46400 1727204537.90602: waiting for pending results... 
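The shell task queued above is what actually touches managed-node2. Its command text can be read verbatim from the module arguments in the result further down, with interface resolving to statebr; the task wrapper below is a reconstruction of tasks/cleanup_profile+device.yml, and the ignore_errors flag is inferred from the "...ignoring" marker attached to the failed result later in the log:

- name: Cleanup profile and device
  shell: |
    nmcli con delete {{ interface }}
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    ip link del {{ interface }}
  ignore_errors: true

Because the lines run as one shell script, the task's rc is the exit status of the last command; the missing statebr kernel device therefore makes the task report rc=1 even though the connection profile itself is deleted successfully.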
46400 1727204537.90796: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204537.90877: in run() - task 0affcd87-79f5-1303-fda8-0000000009a0 46400 1727204537.90882: variable 'ansible_search_path' from source: unknown 46400 1727204537.90884: variable 'ansible_search_path' from source: unknown 46400 1727204537.90914: calling self._execute() 46400 1727204537.90992: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.90997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.91006: variable 'omit' from source: magic vars 46400 1727204537.91292: variable 'ansible_distribution_major_version' from source: facts 46400 1727204537.91303: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204537.91309: variable 'omit' from source: magic vars 46400 1727204537.91343: variable 'omit' from source: magic vars 46400 1727204537.91451: variable 'interface' from source: play vars 46400 1727204537.91470: variable 'omit' from source: magic vars 46400 1727204537.91507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204537.91537: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204537.91555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204537.91573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.91584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204537.91607: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204537.91610: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.91614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.91686: Set connection var ansible_shell_type to sh 46400 1727204537.91695: Set connection var ansible_shell_executable to /bin/sh 46400 1727204537.91698: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204537.91704: Set connection var ansible_connection to ssh 46400 1727204537.91709: Set connection var ansible_pipelining to False 46400 1727204537.91714: Set connection var ansible_timeout to 10 46400 1727204537.91734: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.91737: variable 'ansible_connection' from source: unknown 46400 1727204537.91741: variable 'ansible_module_compression' from source: unknown 46400 1727204537.91743: variable 'ansible_shell_type' from source: unknown 46400 1727204537.91745: variable 'ansible_shell_executable' from source: unknown 46400 1727204537.91748: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204537.91750: variable 'ansible_pipelining' from source: unknown 46400 1727204537.91752: variable 'ansible_timeout' from source: unknown 46400 1727204537.91755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204537.91860: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204537.91874: variable 'omit' from source: magic vars 46400 1727204537.91879: starting attempt loop 46400 1727204537.91881: running the handler 46400 1727204537.91891: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204537.91907: _low_level_execute_command(): starting 46400 1727204537.91914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204537.92452: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204537.92473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204537.92498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204537.92511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204537.92563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204537.92583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204537.92643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204537.94295: stdout chunk (state=3): >>>/root <<< 46400 1727204537.94397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204537.94460: stderr chunk (state=3): >>><<< 46400 1727204537.94466: stdout chunk (state=3): >>><<< 46400 1727204537.94488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 46400 1727204537.94503: _low_level_execute_command(): starting 46400 1727204537.94511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920 `" && echo ansible-tmp-1727204537.9449003-48734-104426675593920="` echo /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920 `" ) && sleep 0' 46400 1727204537.94996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204537.95009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204537.95029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204537.95047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204537.95099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204537.95111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204537.95158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204537.97018: stdout chunk (state=3): >>>ansible-tmp-1727204537.9449003-48734-104426675593920=/root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920 <<< 46400 1727204537.97203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204537.97208: stderr chunk (state=3): >>><<< 46400 1727204537.97210: stdout chunk (state=3): >>><<< 46400 1727204537.97227: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204537.9449003-48734-104426675593920=/root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204537.97267: variable 'ansible_module_compression' from source: unknown 46400 1727204537.97321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204537.97357: variable 'ansible_facts' from source: unknown 46400 1727204537.97448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/AnsiballZ_command.py 46400 1727204537.97597: Sending initial data 46400 1727204537.97601: Sent initial data (156 bytes) 46400 1727204537.98370: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204537.98378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204537.98434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204537.98437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204537.98440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204537.98442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204537.98445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204537.98489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204537.98500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204537.98550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.00286: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204538.00320: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204538.00354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpl86l40ld /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/AnsiballZ_command.py <<< 46400 1727204538.00410: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204538.01533: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 46400 1727204538.01718: stderr chunk (state=3): >>><<< 46400 1727204538.01731: stdout chunk (state=3): >>><<< 46400 1727204538.01847: done transferring module to remote 46400 1727204538.01851: _low_level_execute_command(): starting 46400 1727204538.01853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/ /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/AnsiballZ_command.py && sleep 0' 46400 1727204538.02440: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.02446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.02492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.02496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.02498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.02554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.02569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.02626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.04345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.04421: stderr chunk (state=3): >>><<< 46400 1727204538.04425: stdout chunk (state=3): >>><<< 46400 1727204538.04471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204538.04475: _low_level_execute_command(): starting 46400 
1727204538.04478: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/AnsiballZ_command.py && sleep 0' 46400 1727204538.05511: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.05515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.05526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.05539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.05578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.05586: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.05596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.05613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.05620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.05628: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.05635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.05645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.05657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.05667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.05675: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.05684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.05759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.05776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.05784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.05865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.22496: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (01d469d6-102d-4f29-8240-bb96e82c7461) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:18.187804", "end": "2024-09-24 15:02:18.224050", "delta": "0:00:00.036246", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204538.23619: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204538.23679: stderr chunk (state=3): >>><<< 46400 1727204538.23682: stdout chunk (state=3): >>><<< 46400 1727204538.23700: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (01d469d6-102d-4f29-8240-bb96e82c7461) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:18.187804", "end": "2024-09-24 15:02:18.224050", "delta": "0:00:00.036246", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
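The JSON above is the raw result returned by AnsiballZ_command.py: stdout confirms the statebr connection profile was deleted, stderr shows that only "ip link del statebr" failed because no such kernel device exists, and the module therefore reports rc=1 and failed=true, which the play goes on to ignore. Purely as an illustration of a narrower alternative to ignoring every error, and not what this test does, the same situation could be tolerated with failed_when (command trimmed for brevity):

- name: Cleanup profile and device   # illustrative variant only
  shell: |
    nmcli con delete statebr
    ip link del statebr
  register: cleanup_result
  failed_when:
    - cleanup_result.rc != 0
    - "'Cannot find device' not in cleanup_result.stderr"   # list conditions are AND-ed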
46400 1727204538.23731: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204538.23739: _low_level_execute_command(): starting 46400 1727204538.23743: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204537.9449003-48734-104426675593920/ > /dev/null 2>&1 && sleep 0' 46400 1727204538.24274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.24281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.24291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.24304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.24342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.24348: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.24359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.24376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.24384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.24391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.24399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.24409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.24420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.24429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.24437: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.24445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.24519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.24533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.24548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.24622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.26421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.26480: stderr chunk (state=3): 
>>><<< 46400 1727204538.26485: stdout chunk (state=3): >>><<< 46400 1727204538.26504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204538.26510: handler run complete 46400 1727204538.26528: Evaluated conditional (False): False 46400 1727204538.26535: attempt loop complete, returning result 46400 1727204538.26538: _execute() done 46400 1727204538.26541: dumping result to json 46400 1727204538.26546: done dumping result, returning 46400 1727204538.26553: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-0000000009a0] 46400 1727204538.26563: sending task result for task 0affcd87-79f5-1303-fda8-0000000009a0 46400 1727204538.26658: done sending task result for task 0affcd87-79f5-1303-fda8-0000000009a0 46400 1727204538.26663: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.036246", "end": "2024-09-24 15:02:18.224050", "rc": 1, "start": "2024-09-24 15:02:18.187804" } STDOUT: Connection 'statebr' (01d469d6-102d-4f29-8240-bb96e82c7461) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204538.26723: no more pending results, returning what we have 46400 1727204538.26726: results queue empty 46400 1727204538.26727: checking for any_errors_fatal 46400 1727204538.26729: done checking for any_errors_fatal 46400 1727204538.26729: checking for max_fail_percentage 46400 1727204538.26731: done checking for max_fail_percentage 46400 1727204538.26732: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.26732: done checking to see if all hosts have failed 46400 1727204538.26733: getting the remaining hosts for this loop 46400 1727204538.26735: done getting the remaining hosts for this loop 46400 1727204538.26738: getting the next task for host managed-node2 46400 1727204538.26751: done getting next task for host managed-node2 46400 1727204538.26754: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204538.26755: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204538.26759: getting variables 46400 1727204538.26763: in VariableManager get_vars() 46400 1727204538.26799: Calling all_inventory to load vars for managed-node2 46400 1727204538.26802: Calling groups_inventory to load vars for managed-node2 46400 1727204538.26805: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.26816: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.26818: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.26821: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.27662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.28594: done with get_vars() 46400 1727204538.28613: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.383) 0:00:28.571 ***** 46400 1727204538.28689: entering _queue_task() for managed-node2/include_tasks 46400 1727204538.28935: worker is 1 (out of 1 available) 46400 1727204538.28949: exiting _queue_task() for managed-node2/include_tasks 46400 1727204538.28966: done queuing things up, now waiting for results queue to drain 46400 1727204538.28968: waiting for pending results... 46400 1727204538.29147: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204538.29221: in run() - task 0affcd87-79f5-1303-fda8-000000000011 46400 1727204538.29232: variable 'ansible_search_path' from source: unknown 46400 1727204538.29267: calling self._execute() 46400 1727204538.29341: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.29344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.29354: variable 'omit' from source: magic vars 46400 1727204538.29636: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.29647: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.29653: _execute() done 46400 1727204538.29656: dumping result to json 46400 1727204538.29659: done dumping result, returning 46400 1727204538.29666: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-000000000011] 46400 1727204538.29673: sending task result for task 0affcd87-79f5-1303-fda8-000000000011 46400 1727204538.29777: done sending task result for task 0affcd87-79f5-1303-fda8-000000000011 46400 1727204538.29779: WORKER PROCESS EXITING 46400 1727204538.29808: no more pending results, returning what we have 46400 1727204538.29813: in VariableManager get_vars() 46400 1727204538.29852: Calling all_inventory to load vars for managed-node2 46400 1727204538.29855: Calling groups_inventory to load vars for managed-node2 46400 1727204538.29858: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.29879: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.29883: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.29886: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.30957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 46400 1727204538.31859: done with get_vars() 46400 1727204538.31878: variable 'ansible_search_path' from source: unknown 46400 1727204538.31889: we have included files to process 46400 1727204538.31890: generating all_blocks data 46400 1727204538.31891: done generating all_blocks data 46400 1727204538.31895: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204538.31895: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204538.31897: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204538.32171: in VariableManager get_vars() 46400 1727204538.32186: done with get_vars() 46400 1727204538.32213: in VariableManager get_vars() 46400 1727204538.32223: done with get_vars() 46400 1727204538.32247: in VariableManager get_vars() 46400 1727204538.32257: done with get_vars() 46400 1727204538.32287: in VariableManager get_vars() 46400 1727204538.32299: done with get_vars() 46400 1727204538.32326: in VariableManager get_vars() 46400 1727204538.32351: done with get_vars() 46400 1727204538.32735: in VariableManager get_vars() 46400 1727204538.32751: done with get_vars() 46400 1727204538.32768: done processing included file 46400 1727204538.32770: iterating over new_blocks loaded from include file 46400 1727204538.32772: in VariableManager get_vars() 46400 1727204538.32784: done with get_vars() 46400 1727204538.32786: filtering new block on tags 46400 1727204538.32890: done filtering new block on tags 46400 1727204538.32893: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204538.32899: extending task lists for all hosts with included blocks 46400 1727204538.32935: done extending task lists 46400 1727204538.32936: done processing included files 46400 1727204538.32937: results queue empty 46400 1727204538.32937: checking for any_errors_fatal 46400 1727204538.32942: done checking for any_errors_fatal 46400 1727204538.32942: checking for max_fail_percentage 46400 1727204538.32944: done checking for max_fail_percentage 46400 1727204538.32944: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.32945: done checking to see if all hosts have failed 46400 1727204538.32946: getting the remaining hosts for this loop 46400 1727204538.32947: done getting the remaining hosts for this loop 46400 1727204538.32950: getting the next task for host managed-node2 46400 1727204538.32953: done getting next task for host managed-node2 46400 1727204538.32956: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204538.32958: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204538.32963: getting variables 46400 1727204538.32966: in VariableManager get_vars() 46400 1727204538.32974: Calling all_inventory to load vars for managed-node2 46400 1727204538.32976: Calling groups_inventory to load vars for managed-node2 46400 1727204538.32978: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.32983: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.32986: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.32989: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.34137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.35120: done with get_vars() 46400 1727204538.35135: done getting variables 46400 1727204538.35173: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204538.35257: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.065) 0:00:28.637 ***** 46400 1727204538.35286: entering _queue_task() for managed-node2/debug 46400 1727204538.35529: worker is 1 (out of 1 available) 46400 1727204538.35545: exiting _queue_task() for managed-node2/debug 46400 1727204538.35558: done queuing things up, now waiting for results queue to drain 46400 1727204538.35562: waiting for pending results... 
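
The banner that this TEST task prints next (see the ok: [managed-node2] result below) would be produced by a debug task of roughly this shape at run_test.yml:5; the exact wording of the file is not shown in this log, so the msg layout is inferred from the logged output.

- name: "TEST: {{ lsr_description }}"   # sketch reconstructed from the task header and banner output
  ansible.builtin.debug:
    msg: |
      ##########
      {{ lsr_description }}
      ##########
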
46400 1727204538.35739: running TaskExecutor() for managed-node2/TASK: TEST: I can activate an existing profile 46400 1727204538.35844: in run() - task 0affcd87-79f5-1303-fda8-000000000a49 46400 1727204538.35868: variable 'ansible_search_path' from source: unknown 46400 1727204538.35876: variable 'ansible_search_path' from source: unknown 46400 1727204538.35918: calling self._execute() 46400 1727204538.36023: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.36044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.36058: variable 'omit' from source: magic vars 46400 1727204538.36454: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.36487: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.36498: variable 'omit' from source: magic vars 46400 1727204538.36539: variable 'omit' from source: magic vars 46400 1727204538.36656: variable 'lsr_description' from source: include params 46400 1727204538.36685: variable 'omit' from source: magic vars 46400 1727204538.36739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204538.36784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.36820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204538.36843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.36858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.36897: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.36915: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.36925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.37040: Set connection var ansible_shell_type to sh 46400 1727204538.37054: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.37069: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.37079: Set connection var ansible_connection to ssh 46400 1727204538.37088: Set connection var ansible_pipelining to False 46400 1727204538.37097: Set connection var ansible_timeout to 10 46400 1727204538.37129: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.37142: variable 'ansible_connection' from source: unknown 46400 1727204538.37149: variable 'ansible_module_compression' from source: unknown 46400 1727204538.37155: variable 'ansible_shell_type' from source: unknown 46400 1727204538.37167: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.37174: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.37182: variable 'ansible_pipelining' from source: unknown 46400 1727204538.37188: variable 'ansible_timeout' from source: unknown 46400 1727204538.37196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.37352: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204538.37376: variable 'omit' from source: magic vars 46400 1727204538.37386: starting attempt loop 46400 1727204538.37393: running the handler 46400 1727204538.37441: handler run complete 46400 1727204538.37471: attempt loop complete, returning result 46400 1727204538.37481: _execute() done 46400 1727204538.37488: dumping result to json 46400 1727204538.37494: done dumping result, returning 46400 1727204538.37503: done running TaskExecutor() for managed-node2/TASK: TEST: I can activate an existing profile [0affcd87-79f5-1303-fda8-000000000a49] 46400 1727204538.37513: sending task result for task 0affcd87-79f5-1303-fda8-000000000a49 ok: [managed-node2] => {} MSG: ########## I can activate an existing profile ########## 46400 1727204538.37662: no more pending results, returning what we have 46400 1727204538.37669: results queue empty 46400 1727204538.37670: checking for any_errors_fatal 46400 1727204538.37673: done checking for any_errors_fatal 46400 1727204538.37673: checking for max_fail_percentage 46400 1727204538.37675: done checking for max_fail_percentage 46400 1727204538.37676: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.37677: done checking to see if all hosts have failed 46400 1727204538.37678: getting the remaining hosts for this loop 46400 1727204538.37680: done getting the remaining hosts for this loop 46400 1727204538.37684: getting the next task for host managed-node2 46400 1727204538.37692: done getting next task for host managed-node2 46400 1727204538.37695: ^ task is: TASK: Show item 46400 1727204538.37698: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204538.37701: getting variables 46400 1727204538.37703: in VariableManager get_vars() 46400 1727204538.37739: Calling all_inventory to load vars for managed-node2 46400 1727204538.37742: Calling groups_inventory to load vars for managed-node2 46400 1727204538.37746: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.37758: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.37765: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.37769: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.38804: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a49 46400 1727204538.38808: WORKER PROCESS EXITING 46400 1727204538.39534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.41342: done with get_vars() 46400 1727204538.41381: done getting variables 46400 1727204538.41451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.061) 0:00:28.699 ***** 46400 1727204538.41488: entering _queue_task() for managed-node2/debug 46400 1727204538.41849: worker is 1 (out of 1 available) 46400 1727204538.41867: exiting _queue_task() for managed-node2/debug 46400 1727204538.41883: done queuing things up, now waiting for results queue to drain 46400 1727204538.41885: waiting for pending results... 
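
The 'Show item' task queued here (run_test.yml:9) loops a debug over the test parameters; judging from the per-item results that follow, it is approximately the sketch below. The loop list is inferred from the items that appear in the logged results, not copied from the file.

- name: Show item   # sketch; list of items inferred from the results logged below
  ansible.builtin.debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
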
46400 1727204538.42169: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204538.42286: in run() - task 0affcd87-79f5-1303-fda8-000000000a4a 46400 1727204538.42303: variable 'ansible_search_path' from source: unknown 46400 1727204538.42318: variable 'ansible_search_path' from source: unknown 46400 1727204538.42378: variable 'omit' from source: magic vars 46400 1727204538.42546: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.42567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.42585: variable 'omit' from source: magic vars 46400 1727204538.42954: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.42982: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.42994: variable 'omit' from source: magic vars 46400 1727204538.43036: variable 'omit' from source: magic vars 46400 1727204538.43096: variable 'item' from source: unknown 46400 1727204538.43172: variable 'item' from source: unknown 46400 1727204538.43202: variable 'omit' from source: magic vars 46400 1727204538.43251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204538.43303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.43331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204538.43353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.43374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.43416: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.43425: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.43433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.43539: Set connection var ansible_shell_type to sh 46400 1727204538.43554: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.43570: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.43581: Set connection var ansible_connection to ssh 46400 1727204538.43590: Set connection var ansible_pipelining to False 46400 1727204538.43601: Set connection var ansible_timeout to 10 46400 1727204538.43635: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.43645: variable 'ansible_connection' from source: unknown 46400 1727204538.43652: variable 'ansible_module_compression' from source: unknown 46400 1727204538.43659: variable 'ansible_shell_type' from source: unknown 46400 1727204538.43672: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.43679: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.43686: variable 'ansible_pipelining' from source: unknown 46400 1727204538.43693: variable 'ansible_timeout' from source: unknown 46400 1727204538.43701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.43867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.43883: variable 'omit' from source: magic vars 46400 1727204538.43892: starting attempt loop 46400 1727204538.43898: running the handler 46400 1727204538.43953: variable 'lsr_description' from source: include params 46400 1727204538.44027: variable 'lsr_description' from source: include params 46400 1727204538.44048: handler run complete 46400 1727204538.44078: attempt loop complete, returning result 46400 1727204538.44097: variable 'item' from source: unknown 46400 1727204538.44177: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 46400 1727204538.44409: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.44423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.44436: variable 'omit' from source: magic vars 46400 1727204538.44613: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.44624: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.44633: variable 'omit' from source: magic vars 46400 1727204538.44651: variable 'omit' from source: magic vars 46400 1727204538.44707: variable 'item' from source: unknown 46400 1727204538.44778: variable 'item' from source: unknown 46400 1727204538.44801: variable 'omit' from source: magic vars 46400 1727204538.44824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.44837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.44851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.44872: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.44880: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.44889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.44969: Set connection var ansible_shell_type to sh 46400 1727204538.44983: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.44993: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.45007: Set connection var ansible_connection to ssh 46400 1727204538.45014: Set connection var ansible_pipelining to False 46400 1727204538.45021: Set connection var ansible_timeout to 10 46400 1727204538.45040: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.45046: variable 'ansible_connection' from source: unknown 46400 1727204538.45051: variable 'ansible_module_compression' from source: unknown 46400 1727204538.45055: variable 'ansible_shell_type' from source: unknown 46400 1727204538.45059: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.45069: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.45075: variable 'ansible_pipelining' from source: unknown 46400 1727204538.45079: variable 'ansible_timeout' from source: unknown 46400 1727204538.45084: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node2' 46400 1727204538.45169: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.45183: variable 'omit' from source: magic vars 46400 1727204538.45191: starting attempt loop 46400 1727204538.45196: running the handler 46400 1727204538.45226: variable 'lsr_setup' from source: include params 46400 1727204538.45303: variable 'lsr_setup' from source: include params 46400 1727204538.45356: handler run complete 46400 1727204538.45377: attempt loop complete, returning result 46400 1727204538.45395: variable 'item' from source: unknown 46400 1727204538.45467: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml" ] } 46400 1727204538.45653: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.45672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.45687: variable 'omit' from source: magic vars 46400 1727204538.45856: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.45874: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.45883: variable 'omit' from source: magic vars 46400 1727204538.45901: variable 'omit' from source: magic vars 46400 1727204538.45953: variable 'item' from source: unknown 46400 1727204538.46024: variable 'item' from source: unknown 46400 1727204538.46049: variable 'omit' from source: magic vars 46400 1727204538.46079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.46091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.46102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.46117: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.46125: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.46132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.46218: Set connection var ansible_shell_type to sh 46400 1727204538.46231: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.46241: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.46257: Set connection var ansible_connection to ssh 46400 1727204538.46272: Set connection var ansible_pipelining to False 46400 1727204538.46282: Set connection var ansible_timeout to 10 46400 1727204538.46308: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.46316: variable 'ansible_connection' from source: unknown 46400 1727204538.46322: variable 'ansible_module_compression' from source: unknown 46400 1727204538.46328: variable 'ansible_shell_type' from source: unknown 46400 1727204538.46335: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.46341: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.46349: variable 
'ansible_pipelining' from source: unknown 46400 1727204538.46367: variable 'ansible_timeout' from source: unknown 46400 1727204538.46376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.46480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.46492: variable 'omit' from source: magic vars 46400 1727204538.46500: starting attempt loop 46400 1727204538.46507: running the handler 46400 1727204538.46531: variable 'lsr_test' from source: include params 46400 1727204538.46605: variable 'lsr_test' from source: include params 46400 1727204538.46629: handler run complete 46400 1727204538.46647: attempt loop complete, returning result 46400 1727204538.46670: variable 'item' from source: unknown 46400 1727204538.46740: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 46400 1727204538.46913: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.46926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.46940: variable 'omit' from source: magic vars 46400 1727204538.47112: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.47122: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.47131: variable 'omit' from source: magic vars 46400 1727204538.47148: variable 'omit' from source: magic vars 46400 1727204538.47201: variable 'item' from source: unknown 46400 1727204538.47270: variable 'item' from source: unknown 46400 1727204538.47297: variable 'omit' from source: magic vars 46400 1727204538.47320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.47332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.47342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.47356: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.47370: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.47377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.47456: Set connection var ansible_shell_type to sh 46400 1727204538.47476: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.47486: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.47495: Set connection var ansible_connection to ssh 46400 1727204538.47511: Set connection var ansible_pipelining to False 46400 1727204538.47523: Set connection var ansible_timeout to 10 46400 1727204538.47547: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.47557: variable 'ansible_connection' from source: unknown 46400 1727204538.47569: variable 'ansible_module_compression' from source: unknown 46400 1727204538.47577: variable 'ansible_shell_type' from source: unknown 46400 1727204538.47584: variable 
'ansible_shell_executable' from source: unknown 46400 1727204538.47591: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.47598: variable 'ansible_pipelining' from source: unknown 46400 1727204538.47604: variable 'ansible_timeout' from source: unknown 46400 1727204538.47618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.47720: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.47738: variable 'omit' from source: magic vars 46400 1727204538.47747: starting attempt loop 46400 1727204538.47754: running the handler 46400 1727204538.47783: variable 'lsr_assert' from source: include params 46400 1727204538.47852: variable 'lsr_assert' from source: include params 46400 1727204538.47878: handler run complete 46400 1727204538.47894: attempt loop complete, returning result 46400 1727204538.47910: variable 'item' from source: unknown 46400 1727204538.47980: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 46400 1727204538.48131: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.48144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.48154: variable 'omit' from source: magic vars 46400 1727204538.48381: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.48391: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.48409: variable 'omit' from source: magic vars 46400 1727204538.48426: variable 'omit' from source: magic vars 46400 1727204538.48471: variable 'item' from source: unknown 46400 1727204538.48538: variable 'item' from source: unknown 46400 1727204538.48555: variable 'omit' from source: magic vars 46400 1727204538.48579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.48589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.48597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.48609: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.48622: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.48629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.48704: Set connection var ansible_shell_type to sh 46400 1727204538.48717: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.48734: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.48745: Set connection var ansible_connection to ssh 46400 1727204538.48753: Set connection var ansible_pipelining to False 46400 1727204538.48767: Set connection var ansible_timeout to 10 46400 1727204538.48792: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.48800: variable 'ansible_connection' from source: 
unknown 46400 1727204538.48806: variable 'ansible_module_compression' from source: unknown 46400 1727204538.48812: variable 'ansible_shell_type' from source: unknown 46400 1727204538.48818: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.48824: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.48833: variable 'ansible_pipelining' from source: unknown 46400 1727204538.48846: variable 'ansible_timeout' from source: unknown 46400 1727204538.48854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.48952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.48971: variable 'omit' from source: magic vars 46400 1727204538.48980: starting attempt loop 46400 1727204538.48987: running the handler 46400 1727204538.49099: handler run complete 46400 1727204538.49116: attempt loop complete, returning result 46400 1727204538.49135: variable 'item' from source: unknown 46400 1727204538.49207: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 46400 1727204538.49366: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.49380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.49393: variable 'omit' from source: magic vars 46400 1727204538.49554: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.49570: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.49578: variable 'omit' from source: magic vars 46400 1727204538.49596: variable 'omit' from source: magic vars 46400 1727204538.49647: variable 'item' from source: unknown 46400 1727204538.49715: variable 'item' from source: unknown 46400 1727204538.49733: variable 'omit' from source: magic vars 46400 1727204538.49766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.49779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.49790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.49805: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.49812: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.49819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.49902: Set connection var ansible_shell_type to sh 46400 1727204538.49915: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.49923: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.49931: Set connection var ansible_connection to ssh 46400 1727204538.49939: Set connection var ansible_pipelining to False 46400 1727204538.49948: Set connection var ansible_timeout to 10 46400 1727204538.49982: variable 'ansible_shell_executable' from source: unknown 46400 
1727204538.49990: variable 'ansible_connection' from source: unknown 46400 1727204538.49996: variable 'ansible_module_compression' from source: unknown 46400 1727204538.50002: variable 'ansible_shell_type' from source: unknown 46400 1727204538.50008: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.50014: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.50021: variable 'ansible_pipelining' from source: unknown 46400 1727204538.50027: variable 'ansible_timeout' from source: unknown 46400 1727204538.50034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.50133: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.50145: variable 'omit' from source: magic vars 46400 1727204538.50152: starting attempt loop 46400 1727204538.50158: running the handler 46400 1727204538.50190: variable 'lsr_fail_debug' from source: play vars 46400 1727204538.50254: variable 'lsr_fail_debug' from source: play vars 46400 1727204538.50280: handler run complete 46400 1727204538.50305: attempt loop complete, returning result 46400 1727204538.50324: variable 'item' from source: unknown 46400 1727204538.50390: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 1727204538.50553: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.50572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.50586: variable 'omit' from source: magic vars 46400 1727204538.50717: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.50733: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.50740: variable 'omit' from source: magic vars 46400 1727204538.50756: variable 'omit' from source: magic vars 46400 1727204538.50800: variable 'item' from source: unknown 46400 1727204538.50872: variable 'item' from source: unknown 46400 1727204538.50890: variable 'omit' from source: magic vars 46400 1727204538.50912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.50923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.50933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.50963: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.50974: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.50981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.51053: Set connection var ansible_shell_type to sh 46400 1727204538.51076: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.51087: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.51097: Set connection var ansible_connection to ssh 46400 1727204538.51105: Set connection var 
ansible_pipelining to False 46400 1727204538.51115: Set connection var ansible_timeout to 10 46400 1727204538.51138: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.51146: variable 'ansible_connection' from source: unknown 46400 1727204538.51152: variable 'ansible_module_compression' from source: unknown 46400 1727204538.51159: variable 'ansible_shell_type' from source: unknown 46400 1727204538.51178: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.51186: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.51194: variable 'ansible_pipelining' from source: unknown 46400 1727204538.51201: variable 'ansible_timeout' from source: unknown 46400 1727204538.51208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.51312: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.51324: variable 'omit' from source: magic vars 46400 1727204538.51333: starting attempt loop 46400 1727204538.51339: running the handler 46400 1727204538.51366: variable 'lsr_cleanup' from source: include params 46400 1727204538.51437: variable 'lsr_cleanup' from source: include params 46400 1727204538.51459: handler run complete 46400 1727204538.51483: attempt loop complete, returning result 46400 1727204538.51510: variable 'item' from source: unknown 46400 1727204538.51576: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 46400 1727204538.51693: dumping result to json 46400 1727204538.51707: done dumping result, returning 46400 1727204538.51719: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-000000000a4a] 46400 1727204538.51731: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4a 46400 1727204538.51878: no more pending results, returning what we have 46400 1727204538.51882: results queue empty 46400 1727204538.51884: checking for any_errors_fatal 46400 1727204538.51890: done checking for any_errors_fatal 46400 1727204538.51891: checking for max_fail_percentage 46400 1727204538.51893: done checking for max_fail_percentage 46400 1727204538.51894: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.51895: done checking to see if all hosts have failed 46400 1727204538.51896: getting the remaining hosts for this loop 46400 1727204538.51898: done getting the remaining hosts for this loop 46400 1727204538.51902: getting the next task for host managed-node2 46400 1727204538.51910: done getting next task for host managed-node2 46400 1727204538.51913: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204538.51916: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204538.51920: getting variables 46400 1727204538.51922: in VariableManager get_vars() 46400 1727204538.51958: Calling all_inventory to load vars for managed-node2 46400 1727204538.51963: Calling groups_inventory to load vars for managed-node2 46400 1727204538.51969: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.51982: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.51985: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.51988: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.53005: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a4a 46400 1727204538.53009: WORKER PROCESS EXITING 46400 1727204538.54091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.55809: done with get_vars() 46400 1727204538.55835: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.144) 0:00:28.843 ***** 46400 1727204538.55932: entering _queue_task() for managed-node2/include_tasks 46400 1727204538.56295: worker is 1 (out of 1 available) 46400 1727204538.56308: exiting _queue_task() for managed-node2/include_tasks 46400 1727204538.56324: done queuing things up, now waiting for results queue to drain 46400 1727204538.56326: waiting for pending results... 46400 1727204538.56628: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204538.56767: in run() - task 0affcd87-79f5-1303-fda8-000000000a4b 46400 1727204538.56792: variable 'ansible_search_path' from source: unknown 46400 1727204538.56800: variable 'ansible_search_path' from source: unknown 46400 1727204538.56843: calling self._execute() 46400 1727204538.56950: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.56974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.56991: variable 'omit' from source: magic vars 46400 1727204538.57386: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.57409: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.57422: _execute() done 46400 1727204538.57433: dumping result to json 46400 1727204538.57440: done dumping result, returning 46400 1727204538.57451: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-000000000a4b] 46400 1727204538.57465: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4b 46400 1727204538.57600: no more pending results, returning what we have 46400 1727204538.57605: in VariableManager get_vars() 46400 1727204538.57649: Calling all_inventory to load vars for managed-node2 46400 1727204538.57652: Calling groups_inventory to load vars for managed-node2 46400 1727204538.57657: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.57676: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.57682: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.57686: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.58816: done sending task result for 
task 0affcd87-79f5-1303-fda8-000000000a4b 46400 1727204538.58820: WORKER PROCESS EXITING 46400 1727204538.59501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.61250: done with get_vars() 46400 1727204538.61278: variable 'ansible_search_path' from source: unknown 46400 1727204538.61280: variable 'ansible_search_path' from source: unknown 46400 1727204538.61318: we have included files to process 46400 1727204538.61319: generating all_blocks data 46400 1727204538.61321: done generating all_blocks data 46400 1727204538.61325: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204538.61326: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204538.61328: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204538.61435: in VariableManager get_vars() 46400 1727204538.61458: done with get_vars() 46400 1727204538.61573: done processing included file 46400 1727204538.61575: iterating over new_blocks loaded from include file 46400 1727204538.61577: in VariableManager get_vars() 46400 1727204538.61590: done with get_vars() 46400 1727204538.61591: filtering new block on tags 46400 1727204538.61622: done filtering new block on tags 46400 1727204538.61625: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204538.61629: extending task lists for all hosts with included blocks 46400 1727204538.62090: done extending task lists 46400 1727204538.62091: done processing included files 46400 1727204538.62092: results queue empty 46400 1727204538.62093: checking for any_errors_fatal 46400 1727204538.62102: done checking for any_errors_fatal 46400 1727204538.62103: checking for max_fail_percentage 46400 1727204538.62104: done checking for max_fail_percentage 46400 1727204538.62104: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.62105: done checking to see if all hosts have failed 46400 1727204538.62106: getting the remaining hosts for this loop 46400 1727204538.62107: done getting the remaining hosts for this loop 46400 1727204538.62109: getting the next task for host managed-node2 46400 1727204538.62113: done getting next task for host managed-node2 46400 1727204538.62115: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204538.62118: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204538.62120: getting variables 46400 1727204538.62120: in VariableManager get_vars() 46400 1727204538.62129: Calling all_inventory to load vars for managed-node2 46400 1727204538.62132: Calling groups_inventory to load vars for managed-node2 46400 1727204538.62134: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.62140: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.62142: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.62144: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.63518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.65221: done with get_vars() 46400 1727204538.65247: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.094) 0:00:28.937 ***** 46400 1727204538.65339: entering _queue_task() for managed-node2/include_tasks 46400 1727204538.65709: worker is 1 (out of 1 available) 46400 1727204538.65721: exiting _queue_task() for managed-node2/include_tasks 46400 1727204538.65736: done queuing things up, now waiting for results queue to drain 46400 1727204538.65738: waiting for pending results... 46400 1727204538.66040: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204538.66176: in run() - task 0affcd87-79f5-1303-fda8-000000000a72 46400 1727204538.66196: variable 'ansible_search_path' from source: unknown 46400 1727204538.66204: variable 'ansible_search_path' from source: unknown 46400 1727204538.66250: calling self._execute() 46400 1727204538.66359: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.66376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.66391: variable 'omit' from source: magic vars 46400 1727204538.66782: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.66804: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.66815: _execute() done 46400 1727204538.66824: dumping result to json 46400 1727204538.66835: done dumping result, returning 46400 1727204538.66845: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-000000000a72] 46400 1727204538.66856: sending task result for task 0affcd87-79f5-1303-fda8-000000000a72 46400 1727204538.66990: no more pending results, returning what we have 46400 1727204538.66996: in VariableManager get_vars() 46400 1727204538.67039: Calling all_inventory to load vars for managed-node2 46400 1727204538.67042: Calling groups_inventory to load vars for managed-node2 46400 1727204538.67046: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.67066: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.67071: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.67074: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.68187: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a72 46400 1727204538.68191: WORKER PROCESS EXITING 46400 1727204538.68874: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.70776: done with get_vars() 46400 1727204538.70797: variable 'ansible_search_path' from source: unknown 46400 1727204538.70799: variable 'ansible_search_path' from source: unknown 46400 1727204538.70842: we have included files to process 46400 1727204538.70844: generating all_blocks data 46400 1727204538.70846: done generating all_blocks data 46400 1727204538.70847: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204538.70848: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204538.70851: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204538.71145: done processing included file 46400 1727204538.71148: iterating over new_blocks loaded from include file 46400 1727204538.71150: in VariableManager get_vars() 46400 1727204538.71172: done with get_vars() 46400 1727204538.71176: filtering new block on tags 46400 1727204538.71215: done filtering new block on tags 46400 1727204538.71218: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 46400 1727204538.71223: extending task lists for all hosts with included blocks 46400 1727204538.71411: done extending task lists 46400 1727204538.71412: done processing included files 46400 1727204538.71413: results queue empty 46400 1727204538.71414: checking for any_errors_fatal 46400 1727204538.71418: done checking for any_errors_fatal 46400 1727204538.71419: checking for max_fail_percentage 46400 1727204538.71420: done checking for max_fail_percentage 46400 1727204538.71421: checking to see if all hosts have failed and the running result is not ok 46400 1727204538.71422: done checking to see if all hosts have failed 46400 1727204538.71423: getting the remaining hosts for this loop 46400 1727204538.71424: done getting the remaining hosts for this loop 46400 1727204538.71427: getting the next task for host managed-node2 46400 1727204538.71431: done getting next task for host managed-node2 46400 1727204538.71433: ^ task is: TASK: Gather current interface info 46400 1727204538.71436: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204538.71438: getting variables 46400 1727204538.71439: in VariableManager get_vars() 46400 1727204538.71449: Calling all_inventory to load vars for managed-node2 46400 1727204538.71451: Calling groups_inventory to load vars for managed-node2 46400 1727204538.71453: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204538.71459: Calling all_plugins_play to load vars for managed-node2 46400 1727204538.71466: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204538.71469: Calling groups_plugins_play to load vars for managed-node2 46400 1727204538.72752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204538.74455: done with get_vars() 46400 1727204538.74484: done getting variables 46400 1727204538.74528: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:02:18 -0400 (0:00:00.092) 0:00:29.030 ***** 46400 1727204538.74573: entering _queue_task() for managed-node2/command 46400 1727204538.74934: worker is 1 (out of 1 available) 46400 1727204538.74947: exiting _queue_task() for managed-node2/command 46400 1727204538.74963: done queuing things up, now waiting for results queue to drain 46400 1727204538.74967: waiting for pending results... 
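For orientation: the task being queued here is defined at get_current_interfaces.yml:3. Based on the module arguments and result recorded further below in this log (chdir /sys/class/net, "ls -1", registered as _current_interfaces), the task in that file presumably looks roughly like the following sketch; the source file itself is not shown in this log, so option names beyond those visible in the logged invocation are assumptions.

    # Hypothetical reconstruction of get_current_interfaces.yml:3, inferred from the logged module_args
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces
      # changed_when: false is inferred from the displayed result reporting "changed": false
      changed_when: false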
46400 1727204538.75269: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204538.75428: in run() - task 0affcd87-79f5-1303-fda8-000000000aad 46400 1727204538.75447: variable 'ansible_search_path' from source: unknown 46400 1727204538.75455: variable 'ansible_search_path' from source: unknown 46400 1727204538.75499: calling self._execute() 46400 1727204538.75603: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.75617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.75641: variable 'omit' from source: magic vars 46400 1727204538.76031: variable 'ansible_distribution_major_version' from source: facts 46400 1727204538.76051: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204538.76073: variable 'omit' from source: magic vars 46400 1727204538.76131: variable 'omit' from source: magic vars 46400 1727204538.76177: variable 'omit' from source: magic vars 46400 1727204538.76227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204538.76270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204538.76304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204538.76326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.76340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204538.76378: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204538.76391: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.76402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.76515: Set connection var ansible_shell_type to sh 46400 1727204538.76532: Set connection var ansible_shell_executable to /bin/sh 46400 1727204538.76542: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204538.76553: Set connection var ansible_connection to ssh 46400 1727204538.76569: Set connection var ansible_pipelining to False 46400 1727204538.76580: Set connection var ansible_timeout to 10 46400 1727204538.76610: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.76620: variable 'ansible_connection' from source: unknown 46400 1727204538.76626: variable 'ansible_module_compression' from source: unknown 46400 1727204538.76631: variable 'ansible_shell_type' from source: unknown 46400 1727204538.76636: variable 'ansible_shell_executable' from source: unknown 46400 1727204538.76641: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204538.76646: variable 'ansible_pipelining' from source: unknown 46400 1727204538.76650: variable 'ansible_timeout' from source: unknown 46400 1727204538.76655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204538.76797: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204538.76814: variable 'omit' from source: magic vars 46400 
1727204538.76826: starting attempt loop 46400 1727204538.76836: running the handler 46400 1727204538.76858: _low_level_execute_command(): starting 46400 1727204538.76876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204538.77707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.77727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.77743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.77769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.77818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.77836: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.77851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.77877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.77890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.77903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.77916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.77939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.77959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.77979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.77993: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.78008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.78098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.78115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.78130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.78213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.79882: stdout chunk (state=3): >>>/root <<< 46400 1727204538.80068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.80072: stdout chunk (state=3): >>><<< 46400 1727204538.80082: stderr chunk (state=3): >>><<< 46400 1727204538.80105: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204538.80119: _low_level_execute_command(): starting 46400 1727204538.80126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213 `" && echo ansible-tmp-1727204538.8010473-48767-271475212917213="` echo /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213 `" ) && sleep 0' 46400 1727204538.80973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.80976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.80979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.80981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.80983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.80985: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.80988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.80989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.80992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.80994: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.80996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.80998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.81007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.81014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.81017: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.81024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.81026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.81028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.81030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.81280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.83003: stdout chunk (state=3): >>>ansible-tmp-1727204538.8010473-48767-271475212917213=/root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213 <<< 46400 1727204538.83198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.83202: stdout chunk (state=3): >>><<< 46400 1727204538.83210: stderr chunk (state=3): >>><<< 46400 1727204538.83229: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204538.8010473-48767-271475212917213=/root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204538.83267: variable 'ansible_module_compression' from source: unknown 46400 1727204538.83322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204538.83358: variable 'ansible_facts' from source: unknown 46400 1727204538.83439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/AnsiballZ_command.py 46400 1727204538.83588: Sending initial data 46400 1727204538.83592: Sent initial data (156 bytes) 46400 1727204538.84554: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.84568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.84578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.84593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.84632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.84640: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.84648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.84662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.84671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.84674: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.84681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.84691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.84701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.84708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.84714: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.84722: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.84817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.84826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.84829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.84897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.86598: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204538.86626: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204538.86672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp3y_4k5df /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/AnsiballZ_command.py <<< 46400 1727204538.86714: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204538.87766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.87933: stderr chunk (state=3): >>><<< 46400 1727204538.87937: stdout chunk (state=3): >>><<< 46400 1727204538.87963: done transferring module to remote 46400 1727204538.87973: _low_level_execute_command(): starting 46400 1727204538.87980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/ /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/AnsiballZ_command.py && sleep 0' 46400 1727204538.88675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.88684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.88714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.88758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.88766: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.88780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.88793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.88801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204538.88809: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.88820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.88831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.88848: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.88856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.88866: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.88874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.88952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.88975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.88988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.89057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204538.90767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204538.90890: stderr chunk (state=3): >>><<< 46400 1727204538.90902: stdout chunk (state=3): >>><<< 46400 1727204538.90976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204538.90979: _low_level_execute_command(): starting 46400 1727204538.90982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/AnsiballZ_command.py && sleep 0' 46400 1727204538.91882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204538.91898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.91913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.91931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.91991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.92005: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204538.92019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.92038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204538.92063: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204538.92078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204538.92091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204538.92106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204538.92123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204538.92136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204538.92148: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204538.92180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204538.92256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204538.92290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204538.92309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204538.92392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.05936: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:19.055012", "end": "2024-09-24 15:02:19.058393", "delta": "0:00:00.003381", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204539.07239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204539.07244: stdout chunk (state=3): >>><<< 46400 1727204539.07246: stderr chunk (state=3): >>><<< 46400 1727204539.07376: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:19.055012", "end": "2024-09-24 15:02:19.058393", "delta": "0:00:00.003381", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204539.07386: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204539.07390: _low_level_execute_command(): starting 46400 1727204539.07393: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204538.8010473-48767-271475212917213/ > /dev/null 2>&1 && sleep 0' 46400 1727204539.08570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204539.08589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.08606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.08631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.08684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.08703: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204539.08719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.08744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204539.08758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204539.08776: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204539.08790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.08805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.08822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.08836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.08856: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204539.08878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.08953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204539.08983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.08999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.09097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.10972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204539.10975: stdout chunk (state=3): >>><<< 46400 1727204539.10978: stderr chunk (state=3): >>><<< 46400 1727204539.11124: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204539.11128: handler run complete 46400 1727204539.11131: Evaluated conditional (False): False 46400 1727204539.11133: attempt loop complete, returning result 46400 1727204539.11135: _execute() done 46400 1727204539.11137: dumping result to json 46400 1727204539.11138: done dumping result, returning 46400 1727204539.11140: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-000000000aad] 46400 1727204539.11142: sending task result for task 0affcd87-79f5-1303-fda8-000000000aad 46400 1727204539.11215: done sending task result for task 0affcd87-79f5-1303-fda8-000000000aad 46400 1727204539.11219: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003381", "end": "2024-09-24 15:02:19.058393", "rc": 0, "start": "2024-09-24 15:02:19.055012" } STDOUT: bonding_masters eth0 lo 46400 1727204539.11298: no more pending results, returning what we have 46400 1727204539.11302: results queue empty 46400 1727204539.11303: checking for any_errors_fatal 46400 1727204539.11305: done checking for any_errors_fatal 46400 1727204539.11305: checking for max_fail_percentage 46400 1727204539.11307: done checking for max_fail_percentage 46400 1727204539.11308: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.11309: done checking to see if all hosts have failed 46400 1727204539.11310: getting the remaining hosts for this loop 46400 1727204539.11312: done getting the remaining hosts for this loop 46400 1727204539.11316: getting the next task for host managed-node2 46400 1727204539.11324: done getting next task for host managed-node2 46400 1727204539.11327: ^ task is: TASK: Set current_interfaces 46400 1727204539.11331: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204539.11336: getting variables 46400 1727204539.11337: in VariableManager get_vars() 46400 1727204539.11371: Calling all_inventory to load vars for managed-node2 46400 1727204539.11374: Calling groups_inventory to load vars for managed-node2 46400 1727204539.11378: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.11388: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.11390: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.11397: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.17077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.17982: done with get_vars() 46400 1727204539.18001: done getting variables 46400 1727204539.18036: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.434) 0:00:29.465 ***** 46400 1727204539.18057: entering _queue_task() for managed-node2/set_fact 46400 1727204539.18307: worker is 1 (out of 1 available) 46400 1727204539.18320: exiting _queue_task() for managed-node2/set_fact 46400 1727204539.18335: done queuing things up, now waiting for results queue to drain 46400 1727204539.18337: waiting for pending results... 
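The next task, defined at get_current_interfaces.yml:9, turns the registered command output into the current_interfaces fact. A minimal sketch consistent with the variables and the fact value logged below would be the following; the exact Jinja expression is an assumption, only the variable names and the resulting list come from the log.

    # Hypothetical sketch of get_current_interfaces.yml:9; the stdout_lines expression is assumed
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"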
46400 1727204539.18518: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204539.18610: in run() - task 0affcd87-79f5-1303-fda8-000000000aae 46400 1727204539.18618: variable 'ansible_search_path' from source: unknown 46400 1727204539.18622: variable 'ansible_search_path' from source: unknown 46400 1727204539.18652: calling self._execute() 46400 1727204539.18726: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.18730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.18741: variable 'omit' from source: magic vars 46400 1727204539.19024: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.19036: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.19040: variable 'omit' from source: magic vars 46400 1727204539.19083: variable 'omit' from source: magic vars 46400 1727204539.19166: variable '_current_interfaces' from source: set_fact 46400 1727204539.19212: variable 'omit' from source: magic vars 46400 1727204539.19247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204539.19279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204539.19297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204539.19310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.19318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.19342: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204539.19345: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.19347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.19415: Set connection var ansible_shell_type to sh 46400 1727204539.19423: Set connection var ansible_shell_executable to /bin/sh 46400 1727204539.19429: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204539.19434: Set connection var ansible_connection to ssh 46400 1727204539.19440: Set connection var ansible_pipelining to False 46400 1727204539.19445: Set connection var ansible_timeout to 10 46400 1727204539.19466: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.19469: variable 'ansible_connection' from source: unknown 46400 1727204539.19472: variable 'ansible_module_compression' from source: unknown 46400 1727204539.19476: variable 'ansible_shell_type' from source: unknown 46400 1727204539.19478: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.19481: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.19483: variable 'ansible_pipelining' from source: unknown 46400 1727204539.19486: variable 'ansible_timeout' from source: unknown 46400 1727204539.19488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.19593: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204539.19603: variable 'omit' from source: magic vars 46400 1727204539.19606: starting attempt loop 46400 1727204539.19610: running the handler 46400 1727204539.19619: handler run complete 46400 1727204539.19627: attempt loop complete, returning result 46400 1727204539.19629: _execute() done 46400 1727204539.19631: dumping result to json 46400 1727204539.19635: done dumping result, returning 46400 1727204539.19642: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-000000000aae] 46400 1727204539.19647: sending task result for task 0affcd87-79f5-1303-fda8-000000000aae 46400 1727204539.19751: done sending task result for task 0affcd87-79f5-1303-fda8-000000000aae 46400 1727204539.19754: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204539.19812: no more pending results, returning what we have 46400 1727204539.19820: results queue empty 46400 1727204539.19822: checking for any_errors_fatal 46400 1727204539.19835: done checking for any_errors_fatal 46400 1727204539.19836: checking for max_fail_percentage 46400 1727204539.19838: done checking for max_fail_percentage 46400 1727204539.19839: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.19839: done checking to see if all hosts have failed 46400 1727204539.19840: getting the remaining hosts for this loop 46400 1727204539.19842: done getting the remaining hosts for this loop 46400 1727204539.19846: getting the next task for host managed-node2 46400 1727204539.19855: done getting next task for host managed-node2 46400 1727204539.19858: ^ task is: TASK: Show current_interfaces 46400 1727204539.19865: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204539.19868: getting variables 46400 1727204539.19870: in VariableManager get_vars() 46400 1727204539.19896: Calling all_inventory to load vars for managed-node2 46400 1727204539.19898: Calling groups_inventory to load vars for managed-node2 46400 1727204539.19901: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.19912: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.19914: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.19916: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.20722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.21642: done with get_vars() 46400 1727204539.21658: done getting variables 46400 1727204539.21701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.036) 0:00:29.501 ***** 46400 1727204539.21723: entering _queue_task() for managed-node2/debug 46400 1727204539.21938: worker is 1 (out of 1 available) 46400 1727204539.21952: exiting _queue_task() for managed-node2/debug 46400 1727204539.21965: done queuing things up, now waiting for results queue to drain 46400 1727204539.21967: waiting for pending results... 46400 1727204539.22141: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204539.22218: in run() - task 0affcd87-79f5-1303-fda8-000000000a73 46400 1727204539.22227: variable 'ansible_search_path' from source: unknown 46400 1727204539.22231: variable 'ansible_search_path' from source: unknown 46400 1727204539.22257: calling self._execute() 46400 1727204539.22329: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.22333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.22341: variable 'omit' from source: magic vars 46400 1727204539.22621: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.22630: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.22637: variable 'omit' from source: magic vars 46400 1727204539.22673: variable 'omit' from source: magic vars 46400 1727204539.22737: variable 'current_interfaces' from source: set_fact 46400 1727204539.22762: variable 'omit' from source: magic vars 46400 1727204539.22797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204539.22824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204539.22842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204539.22863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.22872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.22895: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204539.22898: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.22901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.22968: Set connection var ansible_shell_type to sh 46400 1727204539.22979: Set connection var ansible_shell_executable to /bin/sh 46400 1727204539.22982: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204539.22988: Set connection var ansible_connection to ssh 46400 1727204539.22993: Set connection var ansible_pipelining to False 46400 1727204539.22998: Set connection var ansible_timeout to 10 46400 1727204539.23016: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.23019: variable 'ansible_connection' from source: unknown 46400 1727204539.23022: variable 'ansible_module_compression' from source: unknown 46400 1727204539.23024: variable 'ansible_shell_type' from source: unknown 46400 1727204539.23027: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.23029: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.23031: variable 'ansible_pipelining' from source: unknown 46400 1727204539.23033: variable 'ansible_timeout' from source: unknown 46400 1727204539.23036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.23138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204539.23147: variable 'omit' from source: magic vars 46400 1727204539.23152: starting attempt loop 46400 1727204539.23155: running the handler 46400 1727204539.23194: handler run complete 46400 1727204539.23205: attempt loop complete, returning result 46400 1727204539.23208: _execute() done 46400 1727204539.23211: dumping result to json 46400 1727204539.23213: done dumping result, returning 46400 1727204539.23216: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-000000000a73] 46400 1727204539.23222: sending task result for task 0affcd87-79f5-1303-fda8-000000000a73 46400 1727204539.23307: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a73 46400 1727204539.23314: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204539.23354: no more pending results, returning what we have 46400 1727204539.23358: results queue empty 46400 1727204539.23359: checking for any_errors_fatal 46400 1727204539.23368: done checking for any_errors_fatal 46400 1727204539.23369: checking for max_fail_percentage 46400 1727204539.23370: done checking for max_fail_percentage 46400 1727204539.23371: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.23372: done checking to see if all hosts have failed 46400 1727204539.23372: getting the remaining hosts for this loop 46400 1727204539.23374: done getting the remaining hosts for this loop 46400 1727204539.23378: getting the next task for host managed-node2 46400 1727204539.23386: done getting next task for host managed-node2 46400 1727204539.23389: ^ task is: TASK: Setup 46400 1727204539.23395: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204539.23399: getting variables 46400 1727204539.23400: in VariableManager get_vars() 46400 1727204539.23432: Calling all_inventory to load vars for managed-node2 46400 1727204539.23435: Calling groups_inventory to load vars for managed-node2 46400 1727204539.23438: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.23447: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.23449: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.23452: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.24356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.25281: done with get_vars() 46400 1727204539.25296: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.036) 0:00:29.538 ***** 46400 1727204539.25366: entering _queue_task() for managed-node2/include_tasks 46400 1727204539.25586: worker is 1 (out of 1 available) 46400 1727204539.25601: exiting _queue_task() for managed-node2/include_tasks 46400 1727204539.25614: done queuing things up, now waiting for results queue to drain 46400 1727204539.25616: waiting for pending results... 
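The Setup task queued here, at run_test.yml:24, is an include_tasks driven by the lsr_setup variable; given the item/lsr_setup variables and the included file reported just below, it presumably amounts to something like the sketch that follows (the loop keyword and exact spelling are assumptions).

    # Hypothetical sketch of the Setup task at run_test.yml:24
    - name: Setup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_setup }}"
      # in this run lsr_setup yields tasks/create_bridge_profile.yml, per the include logged below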
46400 1727204539.25789: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204539.25868: in run() - task 0affcd87-79f5-1303-fda8-000000000a4c 46400 1727204539.25879: variable 'ansible_search_path' from source: unknown 46400 1727204539.25882: variable 'ansible_search_path' from source: unknown 46400 1727204539.25919: variable 'lsr_setup' from source: include params 46400 1727204539.26086: variable 'lsr_setup' from source: include params 46400 1727204539.26138: variable 'omit' from source: magic vars 46400 1727204539.26238: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.26246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.26254: variable 'omit' from source: magic vars 46400 1727204539.26417: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.26425: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.26431: variable 'item' from source: unknown 46400 1727204539.26481: variable 'item' from source: unknown 46400 1727204539.26504: variable 'item' from source: unknown 46400 1727204539.26547: variable 'item' from source: unknown 46400 1727204539.26667: dumping result to json 46400 1727204539.26670: done dumping result, returning 46400 1727204539.26672: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-000000000a4c] 46400 1727204539.26675: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4c 46400 1727204539.26713: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a4c 46400 1727204539.26716: WORKER PROCESS EXITING 46400 1727204539.26741: no more pending results, returning what we have 46400 1727204539.26746: in VariableManager get_vars() 46400 1727204539.26784: Calling all_inventory to load vars for managed-node2 46400 1727204539.26787: Calling groups_inventory to load vars for managed-node2 46400 1727204539.26790: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.26800: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.26802: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.26805: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.27587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.28605: done with get_vars() 46400 1727204539.28618: variable 'ansible_search_path' from source: unknown 46400 1727204539.28619: variable 'ansible_search_path' from source: unknown 46400 1727204539.28646: we have included files to process 46400 1727204539.28647: generating all_blocks data 46400 1727204539.28648: done generating all_blocks data 46400 1727204539.28651: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204539.28652: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204539.28653: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204539.28843: done processing included file 46400 1727204539.28844: iterating over new_blocks loaded from include file 46400 1727204539.28846: in VariableManager get_vars() 46400 1727204539.28855: done with get_vars() 46400 1727204539.28856: filtering new block on 
tags 46400 1727204539.28884: done filtering new block on tags 46400 1727204539.28886: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 46400 1727204539.28890: extending task lists for all hosts with included blocks 46400 1727204539.29245: done extending task lists 46400 1727204539.29246: done processing included files 46400 1727204539.29247: results queue empty 46400 1727204539.29247: checking for any_errors_fatal 46400 1727204539.29250: done checking for any_errors_fatal 46400 1727204539.29250: checking for max_fail_percentage 46400 1727204539.29251: done checking for max_fail_percentage 46400 1727204539.29252: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.29252: done checking to see if all hosts have failed 46400 1727204539.29253: getting the remaining hosts for this loop 46400 1727204539.29254: done getting the remaining hosts for this loop 46400 1727204539.29255: getting the next task for host managed-node2 46400 1727204539.29258: done getting next task for host managed-node2 46400 1727204539.29259: ^ task is: TASK: Include network role 46400 1727204539.29262: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204539.29265: getting variables 46400 1727204539.29266: in VariableManager get_vars() 46400 1727204539.29272: Calling all_inventory to load vars for managed-node2 46400 1727204539.29274: Calling groups_inventory to load vars for managed-node2 46400 1727204539.29275: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.29279: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.29280: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.29282: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.30032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.31426: done with get_vars() 46400 1727204539.31443: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.061) 0:00:29.599 ***** 46400 1727204539.31500: entering _queue_task() for managed-node2/include_role 46400 1727204539.31730: worker is 1 (out of 1 available) 46400 1727204539.31743: exiting _queue_task() for managed-node2/include_role 46400 1727204539.31757: done queuing things up, now waiting for results queue to drain 46400 1727204539.31758: waiting for pending results... 46400 1727204539.31934: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204539.32018: in run() - task 0affcd87-79f5-1303-fda8-000000000ad1 46400 1727204539.32028: variable 'ansible_search_path' from source: unknown 46400 1727204539.32031: variable 'ansible_search_path' from source: unknown 46400 1727204539.32060: calling self._execute() 46400 1727204539.32144: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.32151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.32163: variable 'omit' from source: magic vars 46400 1727204539.32432: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.32443: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.32448: _execute() done 46400 1727204539.32451: dumping result to json 46400 1727204539.32454: done dumping result, returning 46400 1727204539.32462: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000000ad1] 46400 1727204539.32467: sending task result for task 0affcd87-79f5-1303-fda8-000000000ad1 46400 1727204539.32576: done sending task result for task 0affcd87-79f5-1303-fda8-000000000ad1 46400 1727204539.32578: WORKER PROCESS EXITING 46400 1727204539.32606: no more pending results, returning what we have 46400 1727204539.32611: in VariableManager get_vars() 46400 1727204539.32647: Calling all_inventory to load vars for managed-node2 46400 1727204539.32650: Calling groups_inventory to load vars for managed-node2 46400 1727204539.32654: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.32670: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.32673: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.32676: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.34237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 
1727204539.36068: done with get_vars() 46400 1727204539.36090: variable 'ansible_search_path' from source: unknown 46400 1727204539.36092: variable 'ansible_search_path' from source: unknown 46400 1727204539.36307: variable 'omit' from source: magic vars 46400 1727204539.36356: variable 'omit' from source: magic vars 46400 1727204539.36377: variable 'omit' from source: magic vars 46400 1727204539.36381: we have included files to process 46400 1727204539.36382: generating all_blocks data 46400 1727204539.36384: done generating all_blocks data 46400 1727204539.36385: processing included file: fedora.linux_system_roles.network 46400 1727204539.36406: in VariableManager get_vars() 46400 1727204539.36420: done with get_vars() 46400 1727204539.36453: in VariableManager get_vars() 46400 1727204539.36475: done with get_vars() 46400 1727204539.36513: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204539.36645: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204539.36730: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204539.37208: in VariableManager get_vars() 46400 1727204539.37227: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204539.39341: iterating over new_blocks loaded from include file 46400 1727204539.39344: in VariableManager get_vars() 46400 1727204539.39368: done with get_vars() 46400 1727204539.39370: filtering new block on tags 46400 1727204539.39698: done filtering new block on tags 46400 1727204539.39706: in VariableManager get_vars() 46400 1727204539.39721: done with get_vars() 46400 1727204539.39723: filtering new block on tags 46400 1727204539.39740: done filtering new block on tags 46400 1727204539.39742: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204539.39748: extending task lists for all hosts with included blocks 46400 1727204539.39927: done extending task lists 46400 1727204539.39928: done processing included files 46400 1727204539.39929: results queue empty 46400 1727204539.39930: checking for any_errors_fatal 46400 1727204539.39934: done checking for any_errors_fatal 46400 1727204539.39935: checking for max_fail_percentage 46400 1727204539.39936: done checking for max_fail_percentage 46400 1727204539.39937: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.39937: done checking to see if all hosts have failed 46400 1727204539.39938: getting the remaining hosts for this loop 46400 1727204539.39940: done getting the remaining hosts for this loop 46400 1727204539.39942: getting the next task for host managed-node2 46400 1727204539.39947: done getting next task for host managed-node2 46400 1727204539.39950: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204539.39953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204539.39967: getting variables 46400 1727204539.39968: in VariableManager get_vars() 46400 1727204539.39982: Calling all_inventory to load vars for managed-node2 46400 1727204539.39984: Calling groups_inventory to load vars for managed-node2 46400 1727204539.39986: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.39991: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.39993: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.39996: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.41416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.43147: done with get_vars() 46400 1727204539.43175: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.117) 0:00:29.717 ***** 46400 1727204539.43257: entering _queue_task() for managed-node2/include_tasks 46400 1727204539.43612: worker is 1 (out of 1 available) 46400 1727204539.43629: exiting _queue_task() for managed-node2/include_tasks 46400 1727204539.43644: done queuing things up, now waiting for results queue to drain 46400 1727204539.43645: waiting for pending results... 
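The role entry point at roles/network/tasks/main.yml:4 is executed as an include_tasks action, and the lines that follow show it loading set_facts.yml and then gating a fact-gathering task on the expression __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 (evaluated to False here, so the task is skipped). A rough, non-verbatim sketch of that pattern, reusing the file and variable names visible in the log; the module arguments are illustrative assumptions:

    # hypothetical sketch of main.yml:4 and the guarded fact gathering in set_facts.yml
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml

    # inside set_facts.yml (sketch): gather facts only when required keys are missing
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min    # parameter is illustrative; the log does not show the module arguments
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0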
46400 1727204539.43940: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204539.44120: in run() - task 0affcd87-79f5-1303-fda8-000000000b33 46400 1727204539.44143: variable 'ansible_search_path' from source: unknown 46400 1727204539.44152: variable 'ansible_search_path' from source: unknown 46400 1727204539.44206: calling self._execute() 46400 1727204539.44323: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.44337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.44352: variable 'omit' from source: magic vars 46400 1727204539.44773: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.44791: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.44801: _execute() done 46400 1727204539.44810: dumping result to json 46400 1727204539.44818: done dumping result, returning 46400 1727204539.44830: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000000b33] 46400 1727204539.44848: sending task result for task 0affcd87-79f5-1303-fda8-000000000b33 46400 1727204539.45004: no more pending results, returning what we have 46400 1727204539.45009: in VariableManager get_vars() 46400 1727204539.45056: Calling all_inventory to load vars for managed-node2 46400 1727204539.45063: Calling groups_inventory to load vars for managed-node2 46400 1727204539.45066: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.45084: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.45087: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.45091: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.46106: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b33 46400 1727204539.46110: WORKER PROCESS EXITING 46400 1727204539.46897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.48675: done with get_vars() 46400 1727204539.48705: variable 'ansible_search_path' from source: unknown 46400 1727204539.48707: variable 'ansible_search_path' from source: unknown 46400 1727204539.48756: we have included files to process 46400 1727204539.48758: generating all_blocks data 46400 1727204539.48762: done generating all_blocks data 46400 1727204539.48768: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204539.48769: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204539.48771: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204539.49412: done processing included file 46400 1727204539.49415: iterating over new_blocks loaded from include file 46400 1727204539.49416: in VariableManager get_vars() 46400 1727204539.49442: done with get_vars() 46400 1727204539.49444: filtering new block on tags 46400 1727204539.49483: done filtering new block on tags 46400 1727204539.49487: in VariableManager get_vars() 46400 1727204539.49514: done with get_vars() 46400 1727204539.49516: filtering new block on tags 46400 1727204539.49562: done filtering new block on tags 46400 1727204539.49566: in 
VariableManager get_vars() 46400 1727204539.49589: done with get_vars() 46400 1727204539.49590: filtering new block on tags 46400 1727204539.49633: done filtering new block on tags 46400 1727204539.49635: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204539.49641: extending task lists for all hosts with included blocks 46400 1727204539.51504: done extending task lists 46400 1727204539.51506: done processing included files 46400 1727204539.51507: results queue empty 46400 1727204539.51507: checking for any_errors_fatal 46400 1727204539.51511: done checking for any_errors_fatal 46400 1727204539.51512: checking for max_fail_percentage 46400 1727204539.51513: done checking for max_fail_percentage 46400 1727204539.51514: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.51515: done checking to see if all hosts have failed 46400 1727204539.51515: getting the remaining hosts for this loop 46400 1727204539.51517: done getting the remaining hosts for this loop 46400 1727204539.51519: getting the next task for host managed-node2 46400 1727204539.51524: done getting next task for host managed-node2 46400 1727204539.51527: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204539.51531: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204539.51542: getting variables 46400 1727204539.51544: in VariableManager get_vars() 46400 1727204539.51568: Calling all_inventory to load vars for managed-node2 46400 1727204539.51571: Calling groups_inventory to load vars for managed-node2 46400 1727204539.51573: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.51580: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.51582: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.51585: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.52957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.54673: done with get_vars() 46400 1727204539.54701: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.115) 0:00:29.832 ***** 46400 1727204539.54793: entering _queue_task() for managed-node2/setup 46400 1727204539.55156: worker is 1 (out of 1 available) 46400 1727204539.55174: exiting _queue_task() for managed-node2/setup 46400 1727204539.55187: done queuing things up, now waiting for results queue to drain 46400 1727204539.55189: waiting for pending results... 46400 1727204539.55495: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204539.55677: in run() - task 0affcd87-79f5-1303-fda8-000000000b90 46400 1727204539.55701: variable 'ansible_search_path' from source: unknown 46400 1727204539.55708: variable 'ansible_search_path' from source: unknown 46400 1727204539.55752: calling self._execute() 46400 1727204539.55858: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.55875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.55889: variable 'omit' from source: magic vars 46400 1727204539.56280: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.56298: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.56548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204539.59104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204539.59177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204539.59225: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204539.59271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204539.59306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204539.59393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204539.59430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204539.59468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204539.59520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204539.59537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204539.59595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204539.59619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204539.59649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204539.59705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204539.59726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204539.59907: variable '__network_required_facts' from source: role '' defaults 46400 1727204539.59922: variable 'ansible_facts' from source: unknown 46400 1727204539.60770: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204539.60780: when evaluation is False, skipping this task 46400 1727204539.60786: _execute() done 46400 1727204539.60794: dumping result to json 46400 1727204539.60801: done dumping result, returning 46400 1727204539.60813: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000000b90] 46400 1727204539.60829: sending task result for task 0affcd87-79f5-1303-fda8-000000000b90 46400 1727204539.60951: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b90 46400 1727204539.60959: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204539.61009: no more pending results, returning what we have 46400 1727204539.61014: results queue empty 46400 1727204539.61015: checking for any_errors_fatal 46400 1727204539.61017: done checking for any_errors_fatal 46400 1727204539.61018: checking for max_fail_percentage 46400 1727204539.61020: done checking for max_fail_percentage 46400 1727204539.61021: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.61022: done checking to see if all hosts have failed 46400 1727204539.61023: getting the remaining hosts for this loop 46400 1727204539.61025: done getting the remaining hosts for 
this loop 46400 1727204539.61030: getting the next task for host managed-node2 46400 1727204539.61041: done getting next task for host managed-node2 46400 1727204539.61046: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204539.61054: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204539.61081: getting variables 46400 1727204539.61083: in VariableManager get_vars() 46400 1727204539.61124: Calling all_inventory to load vars for managed-node2 46400 1727204539.61127: Calling groups_inventory to load vars for managed-node2 46400 1727204539.61130: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.61142: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.61145: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.61155: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.62786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.63842: done with get_vars() 46400 1727204539.63863: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.091) 0:00:29.923 ***** 46400 1727204539.63939: entering _queue_task() for managed-node2/stat 46400 1727204539.64203: worker is 1 (out of 1 available) 46400 1727204539.64215: exiting _queue_task() for managed-node2/stat 46400 1727204539.64228: done queuing things up, now waiting for results queue to drain 46400 1727204539.64230: waiting for pending results... 
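The next two tasks, at set_facts.yml:12 and set_facts.yml:17, form a detect-and-cache pair: a stat task checks for an ostree marker and a set_fact task records the result in __network_is_ostree. The subsequent log lines show both being skipped with false_condition "not __network_is_ostree is defined", i.e. the flag was already set earlier in the run. A hedged sketch of that pattern follows; the marker path and register name are assumptions and do not appear in this log:

    # hypothetical sketch of the ostree detection pair in set_facts.yml
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted    # assumed marker path; not shown in this log
      register: __ostree_booted_stat    # register name is hypothetical
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined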
46400 1727204539.64502: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204539.64685: in run() - task 0affcd87-79f5-1303-fda8-000000000b92 46400 1727204539.64704: variable 'ansible_search_path' from source: unknown 46400 1727204539.64711: variable 'ansible_search_path' from source: unknown 46400 1727204539.64755: calling self._execute() 46400 1727204539.64859: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.64872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.64888: variable 'omit' from source: magic vars 46400 1727204539.65262: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.65284: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.65454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204539.65711: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204539.65744: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204539.65774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204539.65796: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204539.65858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204539.65883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204539.65899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204539.65918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204539.65986: variable '__network_is_ostree' from source: set_fact 46400 1727204539.65989: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204539.65992: when evaluation is False, skipping this task 46400 1727204539.65995: _execute() done 46400 1727204539.65998: dumping result to json 46400 1727204539.66005: done dumping result, returning 46400 1727204539.66009: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000000b92] 46400 1727204539.66014: sending task result for task 0affcd87-79f5-1303-fda8-000000000b92 46400 1727204539.66103: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b92 46400 1727204539.66108: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204539.66185: no more pending results, returning what we have 46400 1727204539.66189: results queue empty 46400 1727204539.66190: checking for any_errors_fatal 46400 1727204539.66198: done checking for any_errors_fatal 46400 1727204539.66199: checking for 
max_fail_percentage 46400 1727204539.66201: done checking for max_fail_percentage 46400 1727204539.66201: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.66202: done checking to see if all hosts have failed 46400 1727204539.66203: getting the remaining hosts for this loop 46400 1727204539.66205: done getting the remaining hosts for this loop 46400 1727204539.66208: getting the next task for host managed-node2 46400 1727204539.66218: done getting next task for host managed-node2 46400 1727204539.66222: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204539.66227: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204539.66242: getting variables 46400 1727204539.66244: in VariableManager get_vars() 46400 1727204539.66280: Calling all_inventory to load vars for managed-node2 46400 1727204539.66283: Calling groups_inventory to load vars for managed-node2 46400 1727204539.66285: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.66294: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.66296: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.66298: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.67101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.68615: done with get_vars() 46400 1727204539.68641: done getting variables 46400 1727204539.68706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.048) 0:00:29.972 ***** 46400 1727204539.68749: entering _queue_task() for managed-node2/set_fact 46400 1727204539.69088: worker is 1 (out of 1 available) 46400 1727204539.69101: exiting _queue_task() for managed-node2/set_fact 46400 1727204539.69114: done queuing things up, now waiting for results queue to drain 46400 1727204539.69116: waiting for pending results... 46400 1727204539.69482: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204539.69590: in run() - task 0affcd87-79f5-1303-fda8-000000000b93 46400 1727204539.69606: variable 'ansible_search_path' from source: unknown 46400 1727204539.69610: variable 'ansible_search_path' from source: unknown 46400 1727204539.69637: calling self._execute() 46400 1727204539.69714: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.69718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.69727: variable 'omit' from source: magic vars 46400 1727204539.69993: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.70002: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.70123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204539.70315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204539.70350: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204539.70377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204539.70401: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204539.70468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204539.70487: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204539.70523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204539.70542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204539.70608: variable '__network_is_ostree' from source: set_fact 46400 1727204539.70613: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204539.70616: when evaluation is False, skipping this task 46400 1727204539.70619: _execute() done 46400 1727204539.70622: dumping result to json 46400 1727204539.70624: done dumping result, returning 46400 1727204539.70631: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000000b93] 46400 1727204539.70636: sending task result for task 0affcd87-79f5-1303-fda8-000000000b93 46400 1727204539.70722: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b93 46400 1727204539.70725: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204539.70800: no more pending results, returning what we have 46400 1727204539.70804: results queue empty 46400 1727204539.70805: checking for any_errors_fatal 46400 1727204539.70812: done checking for any_errors_fatal 46400 1727204539.70813: checking for max_fail_percentage 46400 1727204539.70815: done checking for max_fail_percentage 46400 1727204539.70815: checking to see if all hosts have failed and the running result is not ok 46400 1727204539.70817: done checking to see if all hosts have failed 46400 1727204539.70817: getting the remaining hosts for this loop 46400 1727204539.70819: done getting the remaining hosts for this loop 46400 1727204539.70822: getting the next task for host managed-node2 46400 1727204539.70835: done getting next task for host managed-node2 46400 1727204539.70839: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204539.70844: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204539.70862: getting variables 46400 1727204539.70865: in VariableManager get_vars() 46400 1727204539.70896: Calling all_inventory to load vars for managed-node2 46400 1727204539.70899: Calling groups_inventory to load vars for managed-node2 46400 1727204539.70901: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204539.70909: Calling all_plugins_play to load vars for managed-node2 46400 1727204539.70912: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204539.70914: Calling groups_plugins_play to load vars for managed-node2 46400 1727204539.72921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204539.74419: done with get_vars() 46400 1727204539.74444: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:19 -0400 (0:00:00.057) 0:00:30.029 ***** 46400 1727204539.74521: entering _queue_task() for managed-node2/service_facts 46400 1727204539.74763: worker is 1 (out of 1 available) 46400 1727204539.74778: exiting _queue_task() for managed-node2/service_facts 46400 1727204539.74793: done queuing things up, now waiting for results queue to drain 46400 1727204539.74795: waiting for pending results... 
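The task at set_facts.yml:21 runs the service_facts action. The lines that follow trace the full remote execution path for it: reuse of the SSH control master, creation of a temporary directory under ~/.ansible/tmp on the target, sftp transfer of AnsiballZ_service_facts.py, a chmod, and finally execution with /usr/bin/python3.9, after which the module returns an ansible_facts.services dictionary keyed by unit name. A minimal sketch of such a task plus an illustrative consumer of the result; the debug task is an example added here and is not part of the role:

    # minimal sketch: collect service facts and inspect one entry
    - name: Check which services are running
      service_facts:

    - name: Show the state of one service (illustrative follow-up, not from the role)
      debug:
        var: ansible_facts.services['NetworkManager.service']
      when: "'NetworkManager.service' in ansible_facts.services"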
46400 1727204539.75029: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204539.75646: in run() - task 0affcd87-79f5-1303-fda8-000000000b95 46400 1727204539.75651: variable 'ansible_search_path' from source: unknown 46400 1727204539.75653: variable 'ansible_search_path' from source: unknown 46400 1727204539.75656: calling self._execute() 46400 1727204539.75658: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.75661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.75666: variable 'omit' from source: magic vars 46400 1727204539.75823: variable 'ansible_distribution_major_version' from source: facts 46400 1727204539.75827: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204539.75830: variable 'omit' from source: magic vars 46400 1727204539.75902: variable 'omit' from source: magic vars 46400 1727204539.75934: variable 'omit' from source: magic vars 46400 1727204539.75977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204539.76024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204539.76028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204539.76042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.76052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204539.76209: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204539.76213: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.76215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.76542: Set connection var ansible_shell_type to sh 46400 1727204539.76560: Set connection var ansible_shell_executable to /bin/sh 46400 1727204539.76573: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204539.76584: Set connection var ansible_connection to ssh 46400 1727204539.76598: Set connection var ansible_pipelining to False 46400 1727204539.76607: Set connection var ansible_timeout to 10 46400 1727204539.76637: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.76650: variable 'ansible_connection' from source: unknown 46400 1727204539.76657: variable 'ansible_module_compression' from source: unknown 46400 1727204539.76666: variable 'ansible_shell_type' from source: unknown 46400 1727204539.76674: variable 'ansible_shell_executable' from source: unknown 46400 1727204539.76681: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204539.76689: variable 'ansible_pipelining' from source: unknown 46400 1727204539.76700: variable 'ansible_timeout' from source: unknown 46400 1727204539.76709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204539.76938: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204539.76954: variable 'omit' from source: magic vars 46400 
1727204539.76965: starting attempt loop 46400 1727204539.76973: running the handler 46400 1727204539.76991: _low_level_execute_command(): starting 46400 1727204539.77004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204539.77898: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.77904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.77933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.77937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.78007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.78030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.78109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.79756: stdout chunk (state=3): >>>/root <<< 46400 1727204539.79871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204539.79918: stderr chunk (state=3): >>><<< 46400 1727204539.79922: stdout chunk (state=3): >>><<< 46400 1727204539.79941: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204539.79953: _low_level_execute_command(): starting 46400 1727204539.79958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108 `" && echo ansible-tmp-1727204539.7994113-48810-42559969098108="` 
echo /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108 `" ) && sleep 0' 46400 1727204539.80397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.80403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.80448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.80452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.80469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.80522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204539.80526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.80533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.80615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.82445: stdout chunk (state=3): >>>ansible-tmp-1727204539.7994113-48810-42559969098108=/root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108 <<< 46400 1727204539.82558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204539.82609: stderr chunk (state=3): >>><<< 46400 1727204539.82612: stdout chunk (state=3): >>><<< 46400 1727204539.82625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204539.7994113-48810-42559969098108=/root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204539.82673: variable 'ansible_module_compression' from source: unknown 46400 1727204539.82707: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204539.82743: variable 'ansible_facts' from source: unknown 46400 1727204539.82801: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/AnsiballZ_service_facts.py 46400 1727204539.82911: Sending initial data 46400 1727204539.82915: Sent initial data (161 bytes) 46400 1727204539.83746: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204539.83755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.83770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.83785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.83822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.83829: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204539.83841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.83853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204539.83862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204539.83880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204539.83888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.83897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.83909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.83916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.83923: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204539.83933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.84006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204539.84022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.84033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.84100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.85796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204539.85820: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204539.85856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpmhda9klq 
/root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/AnsiballZ_service_facts.py <<< 46400 1727204539.85890: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204539.86973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204539.86976: stdout chunk (state=3): >>><<< 46400 1727204539.86979: stderr chunk (state=3): >>><<< 46400 1727204539.86981: done transferring module to remote 46400 1727204539.86983: _low_level_execute_command(): starting 46400 1727204539.87076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/ /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/AnsiballZ_service_facts.py && sleep 0' 46400 1727204539.87676: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204539.87690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.87706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.87723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.87774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.87787: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204539.87801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.87819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204539.87830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204539.87842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204539.87853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.87883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.87900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.87914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.87925: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204539.87939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.88023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204539.88050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.88084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.88156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204539.89844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204539.89898: stderr chunk (state=3): >>><<< 46400 1727204539.89903: stdout chunk (state=3): >>><<< 46400 1727204539.89918: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204539.89921: _low_level_execute_command(): starting 46400 1727204539.89927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/AnsiballZ_service_facts.py && sleep 0' 46400 1727204539.90521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204539.90530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.90541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.90557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.90609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.90616: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204539.90626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.90639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204539.90646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204539.90652: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204539.90660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204539.90679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204539.90692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204539.90703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204539.90709: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204539.90718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204539.90799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204539.90818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204539.90828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204539.90901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.21898: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": 
{"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": 
{"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 46400 1727204541.21912: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status":<<< 46400 1727204541.21917: stdout chunk (state=3): >>> "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": 
{"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "sourc<<< 46400 1727204541.21939: stdout chunk (state=3): >>>e": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-au<<< 46400 1727204541.21961: stdout chunk (state=3): >>>tofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204541.23339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204541.23343: stdout chunk (state=3): >>><<< 46400 1727204541.23345: stderr chunk (state=3): >>><<< 46400 1727204541.23374: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": 
{"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": 
"alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": 
{"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": 
"systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204541.24075: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204541.24095: _low_level_execute_command(): starting 46400 1727204541.24109: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204539.7994113-48810-42559969098108/ > /dev/null 2>&1 && sleep 0' 46400 1727204541.24921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.24992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.25006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.25023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.25069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.25080: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.25093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.25110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.25121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204541.25142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.25155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.25170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.25184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.25194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.25203: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.25214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.25297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.25318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.25331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.25397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.27186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204541.27299: stderr chunk (state=3): >>><<< 46400 1727204541.27303: stdout chunk (state=3): >>><<< 46400 1727204541.27574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204541.27577: handler run complete 46400 1727204541.27580: variable 'ansible_facts' from source: unknown 46400 1727204541.27687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204541.28431: variable 'ansible_facts' from source: unknown 46400 1727204541.28578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204541.28887: attempt loop complete, returning result 46400 1727204541.28982: _execute() done 46400 1727204541.28990: dumping result to json 46400 1727204541.29057: done dumping result, returning 46400 1727204541.29074: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000000b95] 46400 1727204541.29084: sending task result for task 0affcd87-79f5-1303-fda8-000000000b95 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204541.29977: no more pending results, returning what we have 46400 1727204541.29982: results queue empty 46400 1727204541.29985: checking for any_errors_fatal 46400 1727204541.29992: done checking for any_errors_fatal 46400 1727204541.29993: checking for max_fail_percentage 46400 1727204541.29995: done checking for max_fail_percentage 46400 1727204541.29996: checking to see if all hosts have failed and the running result is not ok 46400 1727204541.29997: done checking to see if all hosts have failed 46400 1727204541.29997: getting the remaining hosts for this loop 46400 1727204541.29999: done getting the remaining hosts for this loop 46400 1727204541.30003: getting the next task for host managed-node2 46400 1727204541.30011: done getting next task for host managed-node2 46400 1727204541.30015: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204541.30021: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204541.30034: getting variables 46400 1727204541.30036: in VariableManager get_vars() 46400 1727204541.30087: Calling all_inventory to load vars for managed-node2 46400 1727204541.30091: Calling groups_inventory to load vars for managed-node2 46400 1727204541.30093: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204541.30105: Calling all_plugins_play to load vars for managed-node2 46400 1727204541.30107: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204541.30110: Calling groups_plugins_play to load vars for managed-node2 46400 1727204541.31134: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b95 46400 1727204541.31138: WORKER PROCESS EXITING 46400 1727204541.32287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204541.34742: done with get_vars() 46400 1727204541.34780: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:21 -0400 (0:00:01.603) 0:00:31.633 ***** 46400 1727204541.34898: entering _queue_task() for managed-node2/package_facts 46400 1727204541.35329: worker is 1 (out of 1 available) 46400 1727204541.35348: exiting _queue_task() for managed-node2/package_facts 46400 1727204541.35361: done queuing things up, now waiting for results queue to drain 46400 1727204541.35363: waiting for pending results... 
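The task queued here (set_facts.yml:26) gathers package facts much as the service check above did. A minimal, hedged sketch of what such a task can look like is shown below; the no_log flag is an assumption inferred from the censored service_facts result earlier, and this is not the role's actual source:

    # Hypothetical sketch of the fact-gathering task queued above; not the role's real code.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
      no_log: true   # assumed, mirroring the censored result seen for service_facts
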
46400 1727204541.35673: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204541.35846: in run() - task 0affcd87-79f5-1303-fda8-000000000b96 46400 1727204541.35868: variable 'ansible_search_path' from source: unknown 46400 1727204541.35876: variable 'ansible_search_path' from source: unknown 46400 1727204541.35927: calling self._execute() 46400 1727204541.36036: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204541.36049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204541.36066: variable 'omit' from source: magic vars 46400 1727204541.36475: variable 'ansible_distribution_major_version' from source: facts 46400 1727204541.36493: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204541.36505: variable 'omit' from source: magic vars 46400 1727204541.36601: variable 'omit' from source: magic vars 46400 1727204541.36642: variable 'omit' from source: magic vars 46400 1727204541.36698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204541.36741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204541.36780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204541.36805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204541.36823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204541.36858: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204541.36874: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204541.36882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204541.36990: Set connection var ansible_shell_type to sh 46400 1727204541.37012: Set connection var ansible_shell_executable to /bin/sh 46400 1727204541.37022: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204541.37032: Set connection var ansible_connection to ssh 46400 1727204541.37041: Set connection var ansible_pipelining to False 46400 1727204541.37050: Set connection var ansible_timeout to 10 46400 1727204541.37080: variable 'ansible_shell_executable' from source: unknown 46400 1727204541.37093: variable 'ansible_connection' from source: unknown 46400 1727204541.37100: variable 'ansible_module_compression' from source: unknown 46400 1727204541.37109: variable 'ansible_shell_type' from source: unknown 46400 1727204541.37116: variable 'ansible_shell_executable' from source: unknown 46400 1727204541.37122: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204541.37129: variable 'ansible_pipelining' from source: unknown 46400 1727204541.37136: variable 'ansible_timeout' from source: unknown 46400 1727204541.37143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204541.37354: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204541.37371: variable 'omit' from source: magic vars 46400 
1727204541.37379: starting attempt loop 46400 1727204541.37385: running the handler 46400 1727204541.37401: _low_level_execute_command(): starting 46400 1727204541.37416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204541.38209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.38223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.38235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.38251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.38299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.38316: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.38332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.38352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.38367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204541.38379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.38398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.38413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.38432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.38448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.38462: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.38480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.38567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.38585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.38600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.38680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.40252: stdout chunk (state=3): >>>/root <<< 46400 1727204541.40432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204541.40436: stdout chunk (state=3): >>><<< 46400 1727204541.40447: stderr chunk (state=3): >>><<< 46400 1727204541.40475: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204541.40487: _low_level_execute_command(): starting 46400 1727204541.40494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120 `" && echo ansible-tmp-1727204541.404729-48952-115426845393120="` echo /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120 `" ) && sleep 0' 46400 1727204541.41143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.41153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.41169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.41183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.41223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.41231: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.41241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.41254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.41261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204541.41275: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.41283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.41292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.41304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.41312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.41319: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.41328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.41404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.41418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.41424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.41499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.43355: stdout chunk (state=3): >>>ansible-tmp-1727204541.404729-48952-115426845393120=/root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120 <<< 46400 1727204541.43472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204541.43559: stderr chunk (state=3): >>><<< 46400 1727204541.43569: stdout chunk (state=3): >>><<< 46400 1727204541.43594: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204541.404729-48952-115426845393120=/root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204541.43645: variable 'ansible_module_compression' from source: unknown 46400 1727204541.43703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204541.43765: variable 'ansible_facts' from source: unknown 46400 1727204541.43955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/AnsiballZ_package_facts.py 46400 1727204541.44122: Sending initial data 46400 1727204541.44125: Sent initial data (161 bytes) 46400 1727204541.45149: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.45160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.45176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.45191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.45232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.45241: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.45250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.45271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.45282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204541.45285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.45293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.45302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.45314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.45322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.45329: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.45338: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.45412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.45430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.45443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.45509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.47240: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204541.47278: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204541.47318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpx8sc398n /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/AnsiballZ_package_facts.py <<< 46400 1727204541.47370: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204541.49915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204541.50011: stderr chunk (state=3): >>><<< 46400 1727204541.50014: stdout chunk (state=3): >>><<< 46400 1727204541.50039: done transferring module to remote 46400 1727204541.50051: _low_level_execute_command(): starting 46400 1727204541.50056: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/ /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/AnsiballZ_package_facts.py && sleep 0' 46400 1727204541.50780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.50791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.50810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.50824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.50871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.50878: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.50890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.50904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.50918: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204541.50925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.50933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.50942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.50953: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.50961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.50974: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.50984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.51065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.51086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.51098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.51162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204541.52953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204541.52957: stdout chunk (state=3): >>><<< 46400 1727204541.52968: stderr chunk (state=3): >>><<< 46400 1727204541.52987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204541.52990: _low_level_execute_command(): starting 46400 1727204541.52996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/AnsiballZ_package_facts.py && sleep 0' 46400 1727204541.53656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204541.53669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.53681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.53695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.53736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.53740: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204541.53751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.53773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204541.53781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204541.53788: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204541.53796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204541.53805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204541.53817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204541.53824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204541.53831: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204541.53844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204541.53917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204541.53932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204541.53937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204541.54020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204542.00320: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": 
"2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204542.00454: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": 
"9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": 
"libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": 
"1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": 
"3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": 
[{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204542.00473: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": 
"2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", 
"version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": 
"481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": 
[{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204542.01960: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204542.02023: stderr chunk (state=3): >>><<< 46400 1727204542.02026: stdout chunk (state=3): >>><<< 46400 1727204542.02062: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", 
"version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": 
[{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": 
"systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": 
"511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": 
"10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", 
"version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204542.04188: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204542.04205: _low_level_execute_command(): starting 46400 1727204542.04208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204541.404729-48952-115426845393120/ > /dev/null 2>&1 && sleep 0' 46400 1727204542.04763: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204542.04780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204542.04783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204542.04798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204542.04889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204542.04893: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204542.04895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204542.04897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204542.04900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204542.04902: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204542.04904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204542.04906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204542.04912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204542.04918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204542.04925: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204542.04934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204542.05011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204542.05028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204542.05041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204542.05110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204542.06991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204542.06995: stdout chunk (state=3): >>><<< 46400 1727204542.06998: stderr chunk (state=3): >>><<< 46400 1727204542.07018: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204542.07023: handler run complete 46400 1727204542.08172: variable 'ansible_facts' from source: unknown 46400 1727204542.08511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.10635: variable 'ansible_facts' from source: unknown 46400 1727204542.10967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.11551: attempt loop complete, returning result 46400 1727204542.11565: _execute() done 46400 1727204542.11568: dumping result to json 46400 1727204542.11696: done dumping result, returning 46400 1727204542.11705: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000000b96] 46400 1727204542.11710: sending task result for task 0affcd87-79f5-1303-fda8-000000000b96 46400 1727204542.14202: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b96 46400 1727204542.14206: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204542.14339: no more pending results, returning what we have 46400 1727204542.14342: results queue empty 46400 1727204542.14343: checking for any_errors_fatal 46400 1727204542.14349: done checking for any_errors_fatal 46400 1727204542.14350: checking for max_fail_percentage 46400 1727204542.14351: done checking for max_fail_percentage 46400 1727204542.14352: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.14353: done checking to see if all hosts have failed 46400 1727204542.14353: getting the remaining hosts for this loop 46400 1727204542.14354: done getting the remaining hosts for this loop 46400 1727204542.14357: getting the next task for host managed-node2 46400 1727204542.14369: done getting next task for host managed-node2 46400 1727204542.14373: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204542.14377: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.14388: getting variables 46400 1727204542.14389: in VariableManager get_vars() 46400 1727204542.14417: Calling all_inventory to load vars for managed-node2 46400 1727204542.14419: Calling groups_inventory to load vars for managed-node2 46400 1727204542.14426: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.14434: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.14436: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.14439: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.15615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.17105: done with get_vars() 46400 1727204542.17122: done getting variables 46400 1727204542.17172: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.823) 0:00:32.456 ***** 46400 1727204542.17201: entering _queue_task() for managed-node2/debug 46400 1727204542.17447: worker is 1 (out of 1 available) 46400 1727204542.17462: exiting _queue_task() for managed-node2/debug 46400 1727204542.17478: done queuing things up, now waiting for results queue to drain 46400 1727204542.17480: waiting for pending results... 
46400 1727204542.17668: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204542.17742: in run() - task 0affcd87-79f5-1303-fda8-000000000b34 46400 1727204542.17755: variable 'ansible_search_path' from source: unknown 46400 1727204542.17758: variable 'ansible_search_path' from source: unknown 46400 1727204542.17789: calling self._execute() 46400 1727204542.17865: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.17870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.17879: variable 'omit' from source: magic vars 46400 1727204542.18157: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.18169: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.18175: variable 'omit' from source: magic vars 46400 1727204542.18217: variable 'omit' from source: magic vars 46400 1727204542.18291: variable 'network_provider' from source: set_fact 46400 1727204542.18304: variable 'omit' from source: magic vars 46400 1727204542.18342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204542.18372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204542.18402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204542.18428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204542.18444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204542.18480: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204542.18489: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.18496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.18594: Set connection var ansible_shell_type to sh 46400 1727204542.18609: Set connection var ansible_shell_executable to /bin/sh 46400 1727204542.18618: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204542.18628: Set connection var ansible_connection to ssh 46400 1727204542.18637: Set connection var ansible_pipelining to False 46400 1727204542.18646: Set connection var ansible_timeout to 10 46400 1727204542.18679: variable 'ansible_shell_executable' from source: unknown 46400 1727204542.18687: variable 'ansible_connection' from source: unknown 46400 1727204542.18694: variable 'ansible_module_compression' from source: unknown 46400 1727204542.18700: variable 'ansible_shell_type' from source: unknown 46400 1727204542.18706: variable 'ansible_shell_executable' from source: unknown 46400 1727204542.18712: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.18719: variable 'ansible_pipelining' from source: unknown 46400 1727204542.18726: variable 'ansible_timeout' from source: unknown 46400 1727204542.18732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.18883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204542.18900: variable 'omit' from source: magic vars 46400 1727204542.18909: starting attempt loop 46400 1727204542.18915: running the handler 46400 1727204542.18960: handler run complete 46400 1727204542.18982: attempt loop complete, returning result 46400 1727204542.18988: _execute() done 46400 1727204542.18994: dumping result to json 46400 1727204542.19014: done dumping result, returning 46400 1727204542.19017: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000000b34] 46400 1727204542.19025: sending task result for task 0affcd87-79f5-1303-fda8-000000000b34 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204542.19191: no more pending results, returning what we have 46400 1727204542.19196: results queue empty 46400 1727204542.19197: checking for any_errors_fatal 46400 1727204542.19208: done checking for any_errors_fatal 46400 1727204542.19208: checking for max_fail_percentage 46400 1727204542.19212: done checking for max_fail_percentage 46400 1727204542.19213: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.19213: done checking to see if all hosts have failed 46400 1727204542.19214: getting the remaining hosts for this loop 46400 1727204542.19216: done getting the remaining hosts for this loop 46400 1727204542.19219: getting the next task for host managed-node2 46400 1727204542.19228: done getting next task for host managed-node2 46400 1727204542.19233: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204542.19239: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204542.19257: getting variables 46400 1727204542.19259: in VariableManager get_vars() 46400 1727204542.19307: Calling all_inventory to load vars for managed-node2 46400 1727204542.19309: Calling groups_inventory to load vars for managed-node2 46400 1727204542.19312: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.19324: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.19335: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.19340: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.19891: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b34 46400 1727204542.19895: WORKER PROCESS EXITING 46400 1727204542.20441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.21617: done with get_vars() 46400 1727204542.21642: done getting variables 46400 1727204542.21699: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.045) 0:00:32.501 ***** 46400 1727204542.21740: entering _queue_task() for managed-node2/fail 46400 1727204542.22117: worker is 1 (out of 1 available) 46400 1727204542.22130: exiting _queue_task() for managed-node2/fail 46400 1727204542.22143: done queuing things up, now waiting for results queue to drain 46400 1727204542.22145: waiting for pending results... 
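The trace above for "Print network provider" is a plain `debug` action whose message resolves to "Using network provider: nm"; `network_provider` was set earlier via set_fact, and the task is gated by the same `ansible_distribution_major_version != '6'` conditional the role applies throughout. A minimal sketch of a task that would produce that output; the exact wording in the collection's tasks/main.yml may differ.

    # Sketch of a debug task equivalent to the "Print network provider" trace.
    # 'network_provider' comes from an earlier set_fact (source: set_fact in the log).
    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"   # resolves to "nm" on this host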
46400 1727204542.22526: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204542.22677: in run() - task 0affcd87-79f5-1303-fda8-000000000b35 46400 1727204542.22704: variable 'ansible_search_path' from source: unknown 46400 1727204542.22710: variable 'ansible_search_path' from source: unknown 46400 1727204542.22771: calling self._execute() 46400 1727204542.22862: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.22868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.22876: variable 'omit' from source: magic vars 46400 1727204542.23311: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.23315: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.23422: variable 'network_state' from source: role '' defaults 46400 1727204542.23431: Evaluated conditional (network_state != {}): False 46400 1727204542.23435: when evaluation is False, skipping this task 46400 1727204542.23451: _execute() done 46400 1727204542.23454: dumping result to json 46400 1727204542.23466: done dumping result, returning 46400 1727204542.23470: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000000b35] 46400 1727204542.23473: sending task result for task 0affcd87-79f5-1303-fda8-000000000b35 46400 1727204542.23701: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b35 46400 1727204542.23752: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204542.23818: no more pending results, returning what we have 46400 1727204542.23822: results queue empty 46400 1727204542.23824: checking for any_errors_fatal 46400 1727204542.23832: done checking for any_errors_fatal 46400 1727204542.23833: checking for max_fail_percentage 46400 1727204542.23835: done checking for max_fail_percentage 46400 1727204542.23836: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.23836: done checking to see if all hosts have failed 46400 1727204542.23837: getting the remaining hosts for this loop 46400 1727204542.23838: done getting the remaining hosts for this loop 46400 1727204542.23844: getting the next task for host managed-node2 46400 1727204542.23854: done getting next task for host managed-node2 46400 1727204542.23859: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204542.23868: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.23896: getting variables 46400 1727204542.23901: in VariableManager get_vars() 46400 1727204542.23941: Calling all_inventory to load vars for managed-node2 46400 1727204542.23944: Calling groups_inventory to load vars for managed-node2 46400 1727204542.23946: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.23957: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.23959: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.23965: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.30245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.31487: done with get_vars() 46400 1727204542.31514: done getting variables 46400 1727204542.31551: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.098) 0:00:32.600 ***** 46400 1727204542.31580: entering _queue_task() for managed-node2/fail 46400 1727204542.31834: worker is 1 (out of 1 available) 46400 1727204542.31849: exiting _queue_task() for managed-node2/fail 46400 1727204542.31868: done queuing things up, now waiting for results queue to drain 46400 1727204542.31871: waiting for pending results... 
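The "Abort applying the network state configuration ... initscripts provider" task above is a `fail` guard that never fires here: the log records `false_condition: network_state != {}`, so the when clause short-circuits before the provider is even considered. A hedged sketch of such a guard follows; the extra `network_provider == "initscripts"` condition is inferred from the task name and is not shown in the log.

    # Sketch only: a fail task skipped because network_state is empty.
    # The second condition is an assumption based on the task name; the log only
    # shows the first expression being evaluated (and returning False).
    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      fail:
        msg: Applying network_state is not supported with the initscripts provider
      when:
        - network_state != {}                  # False here -> task is skipped
        - network_provider == "initscripts"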
46400 1727204542.32076: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204542.32193: in run() - task 0affcd87-79f5-1303-fda8-000000000b36 46400 1727204542.32205: variable 'ansible_search_path' from source: unknown 46400 1727204542.32209: variable 'ansible_search_path' from source: unknown 46400 1727204542.32238: calling self._execute() 46400 1727204542.32371: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.32390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.32406: variable 'omit' from source: magic vars 46400 1727204542.32779: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.32789: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.32911: variable 'network_state' from source: role '' defaults 46400 1727204542.32920: Evaluated conditional (network_state != {}): False 46400 1727204542.32924: when evaluation is False, skipping this task 46400 1727204542.32927: _execute() done 46400 1727204542.32932: dumping result to json 46400 1727204542.32935: done dumping result, returning 46400 1727204542.32939: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000000b36] 46400 1727204542.32945: sending task result for task 0affcd87-79f5-1303-fda8-000000000b36 46400 1727204542.33037: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b36 46400 1727204542.33039: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204542.33094: no more pending results, returning what we have 46400 1727204542.33107: results queue empty 46400 1727204542.33108: checking for any_errors_fatal 46400 1727204542.33119: done checking for any_errors_fatal 46400 1727204542.33120: checking for max_fail_percentage 46400 1727204542.33121: done checking for max_fail_percentage 46400 1727204542.33122: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.33123: done checking to see if all hosts have failed 46400 1727204542.33124: getting the remaining hosts for this loop 46400 1727204542.33125: done getting the remaining hosts for this loop 46400 1727204542.33129: getting the next task for host managed-node2 46400 1727204542.33138: done getting next task for host managed-node2 46400 1727204542.33142: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204542.33148: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.33176: getting variables 46400 1727204542.33178: in VariableManager get_vars() 46400 1727204542.33210: Calling all_inventory to load vars for managed-node2 46400 1727204542.33213: Calling groups_inventory to load vars for managed-node2 46400 1727204542.33215: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.33224: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.33226: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.33228: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.34212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.35478: done with get_vars() 46400 1727204542.35511: done getting variables 46400 1727204542.35574: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.040) 0:00:32.640 ***** 46400 1727204542.35606: entering _queue_task() for managed-node2/fail 46400 1727204542.35962: worker is 1 (out of 1 available) 46400 1727204542.35978: exiting _queue_task() for managed-node2/fail 46400 1727204542.35993: done queuing things up, now waiting for results queue to drain 46400 1727204542.35995: waiting for pending results... 
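Both network_state abort tasks skip for the same reason: the log resolves `network_state` from "role '' defaults", meaning the play never sets it, so the role's default empty mapping wins and `network_state != {}` evaluates to False. A sketch of the relevant default, assuming the conventional defaults/main.yml layout:

    # defaults/main.yml (sketch) -- an empty network_state makes every
    # "Abort applying the network state configuration ..." guard evaluate to False.
    network_state: {}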
46400 1727204542.36272: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204542.36404: in run() - task 0affcd87-79f5-1303-fda8-000000000b37 46400 1727204542.36414: variable 'ansible_search_path' from source: unknown 46400 1727204542.36418: variable 'ansible_search_path' from source: unknown 46400 1727204542.36449: calling self._execute() 46400 1727204542.36543: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.36548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.36558: variable 'omit' from source: magic vars 46400 1727204542.36948: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.36963: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.37173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.39675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.39720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.39784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.39815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.39840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.39917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.39939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.39974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.40003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.40014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.40120: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.40132: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204542.40136: when evaluation is False, skipping this task 46400 1727204542.40139: _execute() done 46400 1727204542.40143: dumping result to json 46400 1727204542.40146: done dumping result, returning 46400 1727204542.40171: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000000b37] 46400 1727204542.40175: sending task result for task 
0affcd87-79f5-1303-fda8-000000000b37 46400 1727204542.40300: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b37 46400 1727204542.40303: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204542.40345: no more pending results, returning what we have 46400 1727204542.40348: results queue empty 46400 1727204542.40349: checking for any_errors_fatal 46400 1727204542.40356: done checking for any_errors_fatal 46400 1727204542.40356: checking for max_fail_percentage 46400 1727204542.40359: done checking for max_fail_percentage 46400 1727204542.40359: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.40360: done checking to see if all hosts have failed 46400 1727204542.40361: getting the remaining hosts for this loop 46400 1727204542.40362: done getting the remaining hosts for this loop 46400 1727204542.40369: getting the next task for host managed-node2 46400 1727204542.40383: done getting next task for host managed-node2 46400 1727204542.40388: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204542.40424: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204542.40445: getting variables 46400 1727204542.40447: in VariableManager get_vars() 46400 1727204542.40482: Calling all_inventory to load vars for managed-node2 46400 1727204542.40485: Calling groups_inventory to load vars for managed-node2 46400 1727204542.40493: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.40503: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.40506: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.40509: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.41794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.42923: done with get_vars() 46400 1727204542.42941: done getting variables 46400 1727204542.42995: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.074) 0:00:32.714 ***** 46400 1727204542.43025: entering _queue_task() for managed-node2/dnf 46400 1727204542.43293: worker is 1 (out of 1 available) 46400 1727204542.43310: exiting _queue_task() for managed-node2/dnf 46400 1727204542.43324: done queuing things up, now waiting for results queue to drain 46400 1727204542.43326: waiting for pending results... 
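The "Abort applying teaming configuration ... EL10 or later" task traced above is another `fail` guard; the log shows its condition `ansible_distribution_major_version | int > 9` evaluating to False (the managed host is EL9 or older), so the task skips. A sketch under the assumption that the check is exactly the condition printed in the log; the failure message is illustrative.

    # Sketch: fail fast on EL10 or later, where teaming is no longer supported.
    # Only the when-expression is taken verbatim from the log.
    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      fail:
        msg: Team interfaces are not supported on EL10 or later
      when: ansible_distribution_major_version | int > 9   # False here -> skipped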
46400 1727204542.43544: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204542.43683: in run() - task 0affcd87-79f5-1303-fda8-000000000b38 46400 1727204542.43705: variable 'ansible_search_path' from source: unknown 46400 1727204542.43708: variable 'ansible_search_path' from source: unknown 46400 1727204542.43729: calling self._execute() 46400 1727204542.43825: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.43872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.43876: variable 'omit' from source: magic vars 46400 1727204542.44231: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.44251: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.44429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.46760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.46851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.46885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.46919: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.46946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.47041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.47071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.47106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.47152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.47186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.47344: variable 'ansible_distribution' from source: facts 46400 1727204542.47348: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.47351: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204542.47504: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204542.47673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.47678: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.47680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.47858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.47862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.47866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.47869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.47872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.47874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.47876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.47905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.47928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.47952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.47995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.48008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.48193: variable 'network_connections' from source: include params 46400 1727204542.48263: variable 'interface' from source: play vars 46400 1727204542.48294: variable 'interface' from source: play vars 46400 1727204542.48343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204542.48526: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204542.48569: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204542.48597: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204542.48627: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204542.48682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204542.48722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204542.48733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.48744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204542.48806: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204542.49125: variable 'network_connections' from source: include params 46400 1727204542.49128: variable 'interface' from source: play vars 46400 1727204542.49131: variable 'interface' from source: play vars 46400 1727204542.49146: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204542.49149: when evaluation is False, skipping this task 46400 1727204542.49152: _execute() done 46400 1727204542.49154: dumping result to json 46400 1727204542.49156: done dumping result, returning 46400 1727204542.49171: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000b38] 46400 1727204542.49177: sending task result for task 0affcd87-79f5-1303-fda8-000000000b38 46400 1727204542.49278: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b38 46400 1727204542.49281: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204542.49325: no more pending results, returning what we have 46400 1727204542.49328: results queue empty 46400 1727204542.49329: checking for any_errors_fatal 46400 1727204542.49336: done checking for any_errors_fatal 46400 1727204542.49337: checking for max_fail_percentage 46400 1727204542.49339: done checking for max_fail_percentage 46400 1727204542.49339: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.49340: done checking to see if all hosts have failed 46400 1727204542.49341: getting the remaining hosts for this loop 46400 1727204542.49343: done getting the remaining hosts for this loop 46400 1727204542.49346: getting the next task for host managed-node2 46400 1727204542.49355: done getting next task for host managed-node2 46400 1727204542.49358: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204542.49363: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.49392: getting variables 46400 1727204542.49394: in VariableManager get_vars() 46400 1727204542.49430: Calling all_inventory to load vars for managed-node2 46400 1727204542.49433: Calling groups_inventory to load vars for managed-node2 46400 1727204542.49435: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.49444: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.49446: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.49448: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.50967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.52815: done with get_vars() 46400 1727204542.52853: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204542.52937: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.099) 0:00:32.814 ***** 46400 1727204542.52983: entering _queue_task() for managed-node2/yum 46400 1727204542.53349: worker is 1 (out of 1 available) 46400 1727204542.53362: exiting _queue_task() for managed-node2/yum 46400 1727204542.53381: done queuing things up, now waiting for results queue to drain 46400 1727204542.53384: waiting for pending results... 
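The "Check if updates for network packages are available through the DNF package manager ..." task traced just above uses the dnf action plugin but is skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is true for the single interface in `network_connections`. A hedged sketch of that pattern; the package list, state, and check_mode usage are assumptions, and the logic that derives the two flags from the connection list is not reproduced here.

    # Sketch: a dnf check that only runs when wireless or team connections are present.
    # The when-expression matches the false_condition printed in the log; module
    # arguments are illustrative.
    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when: __network_wireless_connections_defined or __network_team_connections_defined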
46400 1727204542.53717: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204542.53823: in run() - task 0affcd87-79f5-1303-fda8-000000000b39 46400 1727204542.53837: variable 'ansible_search_path' from source: unknown 46400 1727204542.53840: variable 'ansible_search_path' from source: unknown 46400 1727204542.53876: calling self._execute() 46400 1727204542.53956: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.53967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.53976: variable 'omit' from source: magic vars 46400 1727204542.54253: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.54263: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.54397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.56329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.56387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.56414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.56439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.56461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.56520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.56539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.56557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.56592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.56603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.56671: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.56684: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204542.56687: when evaluation is False, skipping this task 46400 1727204542.56690: _execute() done 46400 1727204542.56693: dumping result to json 46400 1727204542.56695: done dumping result, returning 46400 1727204542.56702: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000b39] 46400 
1727204542.56707: sending task result for task 0affcd87-79f5-1303-fda8-000000000b39 46400 1727204542.56799: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b39 46400 1727204542.56802: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204542.56851: no more pending results, returning what we have 46400 1727204542.56855: results queue empty 46400 1727204542.56856: checking for any_errors_fatal 46400 1727204542.56865: done checking for any_errors_fatal 46400 1727204542.56866: checking for max_fail_percentage 46400 1727204542.56868: done checking for max_fail_percentage 46400 1727204542.56869: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.56869: done checking to see if all hosts have failed 46400 1727204542.56870: getting the remaining hosts for this loop 46400 1727204542.56872: done getting the remaining hosts for this loop 46400 1727204542.56875: getting the next task for host managed-node2 46400 1727204542.56884: done getting next task for host managed-node2 46400 1727204542.56888: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204542.56892: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204542.56918: getting variables 46400 1727204542.56920: in VariableManager get_vars() 46400 1727204542.56955: Calling all_inventory to load vars for managed-node2 46400 1727204542.56957: Calling groups_inventory to load vars for managed-node2 46400 1727204542.56959: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.56971: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.56973: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.56976: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.57872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.58785: done with get_vars() 46400 1727204542.58801: done getting variables 46400 1727204542.58844: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.058) 0:00:32.873 ***** 46400 1727204542.58871: entering _queue_task() for managed-node2/fail 46400 1727204542.59096: worker is 1 (out of 1 available) 46400 1727204542.59110: exiting _queue_task() for managed-node2/fail 46400 1727204542.59124: done queuing things up, now waiting for results queue to drain 46400 1727204542.59126: waiting for pending results... 
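The matching YUM task is the legacy-path twin of the DNF check: note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above, and the skip reason `ansible_distribution_major_version | int < 8`, i.e. it only applies to EL7-era hosts. A sketch under the assumption that the body mirrors the DNF task and only the version guard differs.

    # Sketch: the YUM counterpart, gated to hosts older than EL8. On this host the
    # guard is False, so the task skips; Ansible still resolves the yum action to
    # dnf, as the "redirecting" log line shows.
    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8   # False here -> skipped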
46400 1727204542.59314: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204542.59415: in run() - task 0affcd87-79f5-1303-fda8-000000000b3a 46400 1727204542.59426: variable 'ansible_search_path' from source: unknown 46400 1727204542.59435: variable 'ansible_search_path' from source: unknown 46400 1727204542.59470: calling self._execute() 46400 1727204542.59542: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.59546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.59555: variable 'omit' from source: magic vars 46400 1727204542.59831: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.59841: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.59929: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204542.60062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.61669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.61723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.61749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.61778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.61798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.61857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.61882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.61900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.61929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.61942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.61977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.61993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.62009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.62037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.62051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.62084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.62100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.62116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.62142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.62154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.62271: variable 'network_connections' from source: include params 46400 1727204542.62280: variable 'interface' from source: play vars 46400 1727204542.62328: variable 'interface' from source: play vars 46400 1727204542.62384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204542.62497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204542.62531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204542.62553: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204542.62580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204542.62610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204542.62627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204542.62644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.62661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204542.62712: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204542.62872: variable 'network_connections' 
from source: include params 46400 1727204542.62875: variable 'interface' from source: play vars 46400 1727204542.62920: variable 'interface' from source: play vars 46400 1727204542.62944: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204542.62948: when evaluation is False, skipping this task 46400 1727204542.62951: _execute() done 46400 1727204542.62954: dumping result to json 46400 1727204542.62956: done dumping result, returning 46400 1727204542.62965: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000b3a] 46400 1727204542.62971: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3a 46400 1727204542.63066: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3a 46400 1727204542.63069: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204542.63117: no more pending results, returning what we have 46400 1727204542.63125: results queue empty 46400 1727204542.63126: checking for any_errors_fatal 46400 1727204542.63131: done checking for any_errors_fatal 46400 1727204542.63132: checking for max_fail_percentage 46400 1727204542.63133: done checking for max_fail_percentage 46400 1727204542.63134: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.63135: done checking to see if all hosts have failed 46400 1727204542.63136: getting the remaining hosts for this loop 46400 1727204542.63137: done getting the remaining hosts for this loop 46400 1727204542.63141: getting the next task for host managed-node2 46400 1727204542.63149: done getting next task for host managed-node2 46400 1727204542.63153: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204542.63158: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204542.63178: getting variables 46400 1727204542.63179: in VariableManager get_vars() 46400 1727204542.63212: Calling all_inventory to load vars for managed-node2 46400 1727204542.63214: Calling groups_inventory to load vars for managed-node2 46400 1727204542.63221: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.63234: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.63236: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.63240: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.64047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.65087: done with get_vars() 46400 1727204542.65102: done getting variables 46400 1727204542.65143: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.062) 0:00:32.936 ***** 46400 1727204542.65172: entering _queue_task() for managed-node2/package 46400 1727204542.65398: worker is 1 (out of 1 available) 46400 1727204542.65414: exiting _queue_task() for managed-node2/package 46400 1727204542.65426: done queuing things up, now waiting for results queue to drain 46400 1727204542.65428: waiting for pending results... 46400 1727204542.65621: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204542.65720: in run() - task 0affcd87-79f5-1303-fda8-000000000b3b 46400 1727204542.65729: variable 'ansible_search_path' from source: unknown 46400 1727204542.65734: variable 'ansible_search_path' from source: unknown 46400 1727204542.65763: calling self._execute() 46400 1727204542.65834: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.65840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.65849: variable 'omit' from source: magic vars 46400 1727204542.66116: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.66126: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.66264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204542.66460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204542.66499: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204542.66523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204542.66579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204542.66656: variable 'network_packages' from source: role '' defaults 46400 1727204542.66734: variable '__network_provider_setup' from source: role '' defaults 46400 1727204542.66742: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204542.66792: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204542.66798: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204542.66846: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204542.66961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.68411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.68457: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.68487: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.68512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.68532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.68603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.68622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.68640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.68674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.68686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.68717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.68734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.68750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.68783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.68794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.68943: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204542.69023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.69041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.69057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.69089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.69099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.69162: variable 'ansible_python' from source: facts 46400 1727204542.69178: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204542.69237: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204542.69296: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204542.69383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.69399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.69417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.69445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.69456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.69492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.69512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.69534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.69558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.69572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.69675: variable 'network_connections' from source: include params 46400 1727204542.69678: variable 'interface' from source: play vars 46400 1727204542.69748: variable 'interface' from source: play vars 46400 1727204542.69800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204542.69819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204542.69839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.69866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204542.69904: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204542.70085: variable 'network_connections' from source: include params 46400 1727204542.70089: variable 'interface' from source: play vars 46400 1727204542.70157: variable 'interface' from source: play vars 46400 1727204542.70200: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204542.70256: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204542.70457: variable 'network_connections' from source: include params 46400 1727204542.70461: variable 'interface' from source: play vars 46400 1727204542.70510: variable 'interface' from source: play vars 46400 1727204542.70528: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204542.70587: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204542.70787: variable 'network_connections' from source: include params 46400 1727204542.70790: variable 'interface' from source: play vars 46400 1727204542.70837: variable 'interface' from source: play vars 46400 1727204542.70886: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204542.70932: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204542.70935: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204542.70981: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204542.71118: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204542.71422: variable 'network_connections' from source: include params 46400 1727204542.71425: variable 'interface' from source: play vars 46400 1727204542.71471: variable 'interface' from source: play vars 46400 1727204542.71478: variable 'ansible_distribution' from source: facts 46400 1727204542.71486: variable '__network_rh_distros' from source: role '' defaults 46400 1727204542.71488: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.71510: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204542.71620: variable 'ansible_distribution' from source: 
facts 46400 1727204542.71624: variable '__network_rh_distros' from source: role '' defaults 46400 1727204542.71627: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.71635: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204542.71746: variable 'ansible_distribution' from source: facts 46400 1727204542.71750: variable '__network_rh_distros' from source: role '' defaults 46400 1727204542.71754: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.71784: variable 'network_provider' from source: set_fact 46400 1727204542.71797: variable 'ansible_facts' from source: unknown 46400 1727204542.72257: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204542.72261: when evaluation is False, skipping this task 46400 1727204542.72263: _execute() done 46400 1727204542.72269: dumping result to json 46400 1727204542.72271: done dumping result, returning 46400 1727204542.72280: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000000b3b] 46400 1727204542.72285: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3b 46400 1727204542.72389: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3b 46400 1727204542.72392: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204542.72437: no more pending results, returning what we have 46400 1727204542.72446: results queue empty 46400 1727204542.72447: checking for any_errors_fatal 46400 1727204542.72457: done checking for any_errors_fatal 46400 1727204542.72458: checking for max_fail_percentage 46400 1727204542.72460: done checking for max_fail_percentage 46400 1727204542.72461: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.72461: done checking to see if all hosts have failed 46400 1727204542.72462: getting the remaining hosts for this loop 46400 1727204542.72466: done getting the remaining hosts for this loop 46400 1727204542.72470: getting the next task for host managed-node2 46400 1727204542.72479: done getting next task for host managed-node2 46400 1727204542.72483: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204542.72488: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.72509: getting variables 46400 1727204542.72510: in VariableManager get_vars() 46400 1727204542.72554: Calling all_inventory to load vars for managed-node2 46400 1727204542.72557: Calling groups_inventory to load vars for managed-node2 46400 1727204542.72563: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.72575: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.72577: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.72580: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.73411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.74353: done with get_vars() 46400 1727204542.74376: done getting variables 46400 1727204542.74424: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.092) 0:00:33.029 ***** 46400 1727204542.74451: entering _queue_task() for managed-node2/package 46400 1727204542.74706: worker is 1 (out of 1 available) 46400 1727204542.74721: exiting _queue_task() for managed-node2/package 46400 1727204542.74734: done queuing things up, now waiting for results queue to drain 46400 1727204542.74736: waiting for pending results... 
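The "Install packages" task above skips because its only remaining guard, not network_packages is subset(ansible_facts.packages.keys()), evaluates to False: every package named in network_packages already appears in the package facts gathered earlier in the run, so there is nothing to install. A task guarded this way would look roughly like the sketch below (a paraphrase of what main.yml:73 is doing, not the role's literal source):

  # Sketch only: install the role's package list, but skip when the package
  # facts show everything is already present (the subset test seen in the log).
  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"
      state: present
    when: not network_packages is subset(ansible_facts.packages.keys())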
46400 1727204542.74929: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204542.75029: in run() - task 0affcd87-79f5-1303-fda8-000000000b3c 46400 1727204542.75041: variable 'ansible_search_path' from source: unknown 46400 1727204542.75045: variable 'ansible_search_path' from source: unknown 46400 1727204542.75079: calling self._execute() 46400 1727204542.75161: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.75170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.75179: variable 'omit' from source: magic vars 46400 1727204542.75457: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.75470: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.75556: variable 'network_state' from source: role '' defaults 46400 1727204542.75568: Evaluated conditional (network_state != {}): False 46400 1727204542.75571: when evaluation is False, skipping this task 46400 1727204542.75574: _execute() done 46400 1727204542.75576: dumping result to json 46400 1727204542.75579: done dumping result, returning 46400 1727204542.75586: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000b3c] 46400 1727204542.75593: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3c 46400 1727204542.75691: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3c 46400 1727204542.75694: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204542.75771: no more pending results, returning what we have 46400 1727204542.75775: results queue empty 46400 1727204542.75776: checking for any_errors_fatal 46400 1727204542.75783: done checking for any_errors_fatal 46400 1727204542.75783: checking for max_fail_percentage 46400 1727204542.75785: done checking for max_fail_percentage 46400 1727204542.75786: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.75787: done checking to see if all hosts have failed 46400 1727204542.75788: getting the remaining hosts for this loop 46400 1727204542.75790: done getting the remaining hosts for this loop 46400 1727204542.75793: getting the next task for host managed-node2 46400 1727204542.75802: done getting next task for host managed-node2 46400 1727204542.75807: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204542.75816: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.75839: getting variables 46400 1727204542.75841: in VariableManager get_vars() 46400 1727204542.75874: Calling all_inventory to load vars for managed-node2 46400 1727204542.75877: Calling groups_inventory to load vars for managed-node2 46400 1727204542.75879: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.75888: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.75890: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.75893: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.77153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.78704: done with get_vars() 46400 1727204542.78741: done getting variables 46400 1727204542.78810: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.043) 0:00:33.073 ***** 46400 1727204542.78850: entering _queue_task() for managed-node2/package 46400 1727204542.79204: worker is 1 (out of 1 available) 46400 1727204542.79217: exiting _queue_task() for managed-node2/package 46400 1727204542.79230: done queuing things up, now waiting for results queue to drain 46400 1727204542.79231: waiting for pending results... 
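The task just skipped (main.yml:85) and the one being queued here (main.yml:96) are both guarded by network_state != {}. In this run network_state comes from the role defaults and is the empty mapping, so both skip. They would only fire if the play handed the role a non-empty network_state; the values below are purely illustrative and not taken from this run:

  # Illustrative only: a play that supplies a network_state and therefore takes
  # the nmstate install path that is skipped in this log.
  - hosts: managed-node2
    roles:
      - role: fedora.linux_system_roles.network
        vars:
          network_state:
            interfaces:
              - name: eth1
                type: ethernet
                state: up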
46400 1727204542.79531: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204542.79702: in run() - task 0affcd87-79f5-1303-fda8-000000000b3d 46400 1727204542.79720: variable 'ansible_search_path' from source: unknown 46400 1727204542.79727: variable 'ansible_search_path' from source: unknown 46400 1727204542.79770: calling self._execute() 46400 1727204542.79872: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.79889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.79904: variable 'omit' from source: magic vars 46400 1727204542.80302: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.80327: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.80457: variable 'network_state' from source: role '' defaults 46400 1727204542.80479: Evaluated conditional (network_state != {}): False 46400 1727204542.80487: when evaluation is False, skipping this task 46400 1727204542.80494: _execute() done 46400 1727204542.80500: dumping result to json 46400 1727204542.80507: done dumping result, returning 46400 1727204542.80518: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000b3d] 46400 1727204542.80527: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204542.80695: no more pending results, returning what we have 46400 1727204542.80700: results queue empty 46400 1727204542.80702: checking for any_errors_fatal 46400 1727204542.80710: done checking for any_errors_fatal 46400 1727204542.80710: checking for max_fail_percentage 46400 1727204542.80713: done checking for max_fail_percentage 46400 1727204542.80714: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.80715: done checking to see if all hosts have failed 46400 1727204542.80715: getting the remaining hosts for this loop 46400 1727204542.80717: done getting the remaining hosts for this loop 46400 1727204542.80721: getting the next task for host managed-node2 46400 1727204542.80732: done getting next task for host managed-node2 46400 1727204542.80737: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204542.80743: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204542.80770: getting variables 46400 1727204542.80773: in VariableManager get_vars() 46400 1727204542.80816: Calling all_inventory to load vars for managed-node2 46400 1727204542.80819: Calling groups_inventory to load vars for managed-node2 46400 1727204542.80822: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.80835: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.80840: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.80843: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.81958: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3d 46400 1727204542.81962: WORKER PROCESS EXITING 46400 1727204542.81981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.82918: done with get_vars() 46400 1727204542.82936: done getting variables 46400 1727204542.82994: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.041) 0:00:33.114 ***** 46400 1727204542.83036: entering _queue_task() for managed-node2/service 46400 1727204542.83365: worker is 1 (out of 1 available) 46400 1727204542.83379: exiting _queue_task() for managed-node2/service 46400 1727204542.83392: done queuing things up, now waiting for results queue to drain 46400 1727204542.83393: waiting for pending results... 
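The restart task queued here (main.yml:109), like the earlier "Ask user's consent to restart NetworkManager" task, is guarded by __network_wireless_connections_defined or __network_team_connections_defined. The repeated network_connections / interface variable lookups in the log are those guards being derived from the connection list; since this play only manages an ordinary interface connection, both stay false and the task skips below. The guarded restart is roughly equivalent to this sketch (not the role's literal task):

  # Sketch only: restart NetworkManager, but only when wireless or team
  # connections are being managed (the condition evaluated in the log).
  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined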
46400 1727204542.83692: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204542.83832: in run() - task 0affcd87-79f5-1303-fda8-000000000b3e 46400 1727204542.83853: variable 'ansible_search_path' from source: unknown 46400 1727204542.83865: variable 'ansible_search_path' from source: unknown 46400 1727204542.83906: calling self._execute() 46400 1727204542.84011: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.84023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.84037: variable 'omit' from source: magic vars 46400 1727204542.84440: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.84457: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.84588: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204542.84802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.87596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.87678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.87735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.87785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.87816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.87908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.87943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.87984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.88031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.88051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.88108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.88137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.88172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204542.88219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.88237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.88290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.88318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.88346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.88399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.88420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.88604: variable 'network_connections' from source: include params 46400 1727204542.88626: variable 'interface' from source: play vars 46400 1727204542.88706: variable 'interface' from source: play vars 46400 1727204542.88785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204542.88963: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204542.89006: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204542.89041: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204542.89094: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204542.89140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204542.89177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204542.89207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.89240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204542.89313: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204542.89580: variable 'network_connections' from source: include params 46400 1727204542.89592: variable 'interface' 
from source: play vars 46400 1727204542.89663: variable 'interface' from source: play vars 46400 1727204542.89704: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204542.89715: when evaluation is False, skipping this task 46400 1727204542.89721: _execute() done 46400 1727204542.89727: dumping result to json 46400 1727204542.89733: done dumping result, returning 46400 1727204542.89744: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000b3e] 46400 1727204542.89753: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3e skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204542.89919: no more pending results, returning what we have 46400 1727204542.89924: results queue empty 46400 1727204542.89925: checking for any_errors_fatal 46400 1727204542.89933: done checking for any_errors_fatal 46400 1727204542.89934: checking for max_fail_percentage 46400 1727204542.89935: done checking for max_fail_percentage 46400 1727204542.89936: checking to see if all hosts have failed and the running result is not ok 46400 1727204542.89937: done checking to see if all hosts have failed 46400 1727204542.89938: getting the remaining hosts for this loop 46400 1727204542.89940: done getting the remaining hosts for this loop 46400 1727204542.89944: getting the next task for host managed-node2 46400 1727204542.89954: done getting next task for host managed-node2 46400 1727204542.89958: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204542.89967: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204542.89988: getting variables 46400 1727204542.89990: in VariableManager get_vars() 46400 1727204542.90029: Calling all_inventory to load vars for managed-node2 46400 1727204542.90032: Calling groups_inventory to load vars for managed-node2 46400 1727204542.90035: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204542.90046: Calling all_plugins_play to load vars for managed-node2 46400 1727204542.90049: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204542.90052: Calling groups_plugins_play to load vars for managed-node2 46400 1727204542.91083: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3e 46400 1727204542.91087: WORKER PROCESS EXITING 46400 1727204542.92008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204542.93667: done with get_vars() 46400 1727204542.93693: done getting variables 46400 1727204542.93755: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:22 -0400 (0:00:00.107) 0:00:33.222 ***** 46400 1727204542.93797: entering _queue_task() for managed-node2/service 46400 1727204542.94134: worker is 1 (out of 1 available) 46400 1727204542.94148: exiting _queue_task() for managed-node2/service 46400 1727204542.94165: done queuing things up, now waiting for results queue to drain 46400 1727204542.94167: waiting for pending results... 
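Unlike the skipped tasks above, "Enable and start NetworkManager" (main.yml:122) passes its conditional: network_state is empty, so the True result for network_provider == "nm" or network_state != {} means the provider set earlier via set_fact is "nm". The service action plugin is therefore loaded, connection variables are bound (ansible_shell_type=sh, pipelining off, a 10-second timeout), and the ssh connection plugin starts with a home-directory probe (/bin/sh -c 'echo ~ && sleep 0') before staging the module on the remote host. The task being executed is approximately the following sketch (not the role's literal source):

  # Sketch only: ensure the selected provider's service is running and enabled;
  # the log shows this handled by the service action module over ssh.
  - name: Enable and start NetworkManager
    ansible.builtin.service:
      name: "{{ network_service_name }}"
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}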
46400 1727204542.94468: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204542.94623: in run() - task 0affcd87-79f5-1303-fda8-000000000b3f 46400 1727204542.94643: variable 'ansible_search_path' from source: unknown 46400 1727204542.94653: variable 'ansible_search_path' from source: unknown 46400 1727204542.94698: calling self._execute() 46400 1727204542.94804: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204542.94817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204542.94837: variable 'omit' from source: magic vars 46400 1727204542.95230: variable 'ansible_distribution_major_version' from source: facts 46400 1727204542.95247: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204542.95415: variable 'network_provider' from source: set_fact 46400 1727204542.95425: variable 'network_state' from source: role '' defaults 46400 1727204542.95439: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204542.95448: variable 'omit' from source: magic vars 46400 1727204542.95521: variable 'omit' from source: magic vars 46400 1727204542.95549: variable 'network_service_name' from source: role '' defaults 46400 1727204542.95625: variable 'network_service_name' from source: role '' defaults 46400 1727204542.95741: variable '__network_provider_setup' from source: role '' defaults 46400 1727204542.95751: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204542.95824: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204542.95837: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204542.95905: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204542.96135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204542.98520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204542.98608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204542.98654: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204542.98698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204542.98730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204542.98818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.98858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.98895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.98940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204542.98979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.99012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.99028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.99045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.99076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.99089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.99258: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204542.99342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.99359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.99380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.99404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.99418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.99484: variable 'ansible_python' from source: facts 46400 1727204542.99496: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204542.99556: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204542.99615: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204542.99704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.99720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.99737: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.99772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.99782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.99815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204542.99834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204542.99857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204542.99889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204542.99899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204542.99998: variable 'network_connections' from source: include params 46400 1727204543.00005: variable 'interface' from source: play vars 46400 1727204543.00057: variable 'interface' from source: play vars 46400 1727204543.00135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204543.00272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204543.00310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204543.00340: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204543.00373: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204543.00419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204543.00440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204543.00462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.00488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204543.00538: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204543.00793: variable 'network_connections' from source: include params 46400 1727204543.00805: variable 'interface' from source: play vars 46400 1727204543.00879: variable 'interface' from source: play vars 46400 1727204543.00929: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204543.01012: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204543.01295: variable 'network_connections' from source: include params 46400 1727204543.01305: variable 'interface' from source: play vars 46400 1727204543.01376: variable 'interface' from source: play vars 46400 1727204543.01403: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204543.01484: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204543.01771: variable 'network_connections' from source: include params 46400 1727204543.01782: variable 'interface' from source: play vars 46400 1727204543.01853: variable 'interface' from source: play vars 46400 1727204543.01919: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204543.01986: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204543.01998: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204543.02056: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204543.02201: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204543.02539: variable 'network_connections' from source: include params 46400 1727204543.02542: variable 'interface' from source: play vars 46400 1727204543.02586: variable 'interface' from source: play vars 46400 1727204543.02594: variable 'ansible_distribution' from source: facts 46400 1727204543.02596: variable '__network_rh_distros' from source: role '' defaults 46400 1727204543.02603: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.02624: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204543.02741: variable 'ansible_distribution' from source: facts 46400 1727204543.02744: variable '__network_rh_distros' from source: role '' defaults 46400 1727204543.02747: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.02758: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204543.02873: variable 'ansible_distribution' from source: facts 46400 1727204543.02877: variable '__network_rh_distros' from source: role '' defaults 46400 1727204543.02882: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.02908: variable 'network_provider' from source: set_fact 46400 1727204543.02924: variable 'omit' from source: magic vars 46400 1727204543.02945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204543.02969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204543.02992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204543.03005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204543.03013: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204543.03035: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204543.03039: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.03041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.03115: Set connection var ansible_shell_type to sh 46400 1727204543.03122: Set connection var ansible_shell_executable to /bin/sh 46400 1727204543.03127: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204543.03132: Set connection var ansible_connection to ssh 46400 1727204543.03137: Set connection var ansible_pipelining to False 46400 1727204543.03142: Set connection var ansible_timeout to 10 46400 1727204543.03167: variable 'ansible_shell_executable' from source: unknown 46400 1727204543.03170: variable 'ansible_connection' from source: unknown 46400 1727204543.03172: variable 'ansible_module_compression' from source: unknown 46400 1727204543.03174: variable 'ansible_shell_type' from source: unknown 46400 1727204543.03177: variable 'ansible_shell_executable' from source: unknown 46400 1727204543.03179: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.03181: variable 'ansible_pipelining' from source: unknown 46400 1727204543.03184: variable 'ansible_timeout' from source: unknown 46400 1727204543.03192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.03267: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204543.03275: variable 'omit' from source: magic vars 46400 1727204543.03278: starting attempt loop 46400 1727204543.03281: running the handler 46400 1727204543.03339: variable 'ansible_facts' from source: unknown 46400 1727204543.03951: _low_level_execute_command(): starting 46400 1727204543.03970: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204543.04660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204543.04678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.04694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.04714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.04757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.04771: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204543.04788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.04808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204543.04821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204543.04831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204543.04844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.04858: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204543.04877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.04889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.04899: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204543.04910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.04986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.05008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.05023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.05098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.06745: stdout chunk (state=3): >>>/root <<< 46400 1727204543.06849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.06907: stderr chunk (state=3): >>><<< 46400 1727204543.06910: stdout chunk (state=3): >>><<< 46400 1727204543.06928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204543.06938: _low_level_execute_command(): starting 46400 1727204543.06943: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502 `" && echo ansible-tmp-1727204543.069275-49146-273297550589502="` echo /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502 `" ) && sleep 0' 46400 1727204543.07401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.07405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.07438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.07443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.07446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.07527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.07531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.07615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.09477: stdout chunk (state=3): >>>ansible-tmp-1727204543.069275-49146-273297550589502=/root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502 <<< 46400 1727204543.09579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.09656: stderr chunk (state=3): >>><<< 46400 1727204543.09672: stdout chunk (state=3): >>><<< 46400 1727204543.09767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204543.069275-49146-273297550589502=/root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204543.09774: variable 'ansible_module_compression' from source: unknown 46400 1727204543.09823: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204543.09896: variable 'ansible_facts' from source: unknown 46400 1727204543.10112: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/AnsiballZ_systemd.py 46400 1727204543.10383: Sending initial data 46400 1727204543.10386: Sent initial data (155 bytes) 46400 1727204543.11084: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204543.11094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.11108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.11123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 
1727204543.11153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.11161: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204543.11192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.11195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204543.11197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.11199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.11256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.11259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.11261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.11303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.12997: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 46400 1727204543.13007: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204543.13015: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 46400 1727204543.13022: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 46400 1727204543.13028: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 46400 1727204543.13038: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 46400 1727204543.13045: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 46400 1727204543.13052: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 46400 1727204543.13058: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204543.13112: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 46400 1727204543.13120: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 46400 1727204543.13126: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 46400 1727204543.13176: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpan3zp14b /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/AnsiballZ_systemd.py <<< 46400 1727204543.13230: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204543.15757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.15928: stderr chunk (state=3): >>><<< 46400 1727204543.15932: stdout chunk (state=3): >>><<< 46400 1727204543.15952: done transferring module to remote 46400 1727204543.15969: _low_level_execute_command(): starting 46400 1727204543.15976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/ /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/AnsiballZ_systemd.py && sleep 0' 46400 1727204543.17949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.17969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.18048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.18074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.18097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204543.18102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.18358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.18385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.18407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.18470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.20196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.20246: stderr chunk (state=3): >>><<< 46400 1727204543.20249: stdout chunk (state=3): >>><<< 46400 1727204543.20262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204543.20269: _low_level_execute_command(): starting 46400 1727204543.20275: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/AnsiballZ_systemd.py && sleep 0' 46400 1727204543.20755: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.20759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.20799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.20807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.20813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.20824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.20831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.20837: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204543.20843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.20900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.20914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.20921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.20983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.46294: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ 
path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204543.46341: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "2050832000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": 
"no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204543.46346: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, 
"scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204543.47988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204543.47992: stdout chunk (state=3): >>><<< 46400 1727204543.47994: stderr chunk (state=3): >>><<< 46400 1727204543.48257: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6995968", "MemoryAvailable": "infinity", "CPUUsageNSec": "2050832000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": 
"yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice 
basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
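The record above shows the `service` action plugin dispatching to the `ansible.legacy.systemd` module with `name: NetworkManager`, `state: started`, `enabled: true`, and the per-task result being censored because `no_log: true` was set. As a hedged illustration only (the real task lives in the role file referenced later in this log, and its exact wording is not reproduced here), a standalone play producing an equivalent module invocation could look like this:

```yaml
---
# Minimal sketch, not the role's actual task: it only mirrors the module
# arguments visible in the log (name/state/enabled) and the no_log censoring.
- hosts: managed-node2
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.service:   # resolved to the systemd module on this host
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # why the result below is reported as "censored"
```

The `changed: false` in the result is consistent with the unit state returned by the module (`ActiveState: active`, `UnitFileState: enabled`), so the task was a no-op on this host.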
46400 1727204543.48270: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204543.48274: _low_level_execute_command(): starting 46400 1727204543.48276: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204543.069275-49146-273297550589502/ > /dev/null 2>&1 && sleep 0' 46400 1727204543.48838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204543.48854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.48875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.48894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.48935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.48948: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204543.48968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.48987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204543.48998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204543.49009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204543.49020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.49032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.49047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.49058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.49076: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204543.49090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.49173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.49196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.49212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.49286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.51170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.51173: stdout chunk (state=3): >>><<< 46400 1727204543.51175: stderr chunk (state=3): >>><<< 46400 1727204543.51370: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204543.51374: handler run complete 46400 1727204543.51376: attempt loop complete, returning result 46400 1727204543.51378: _execute() done 46400 1727204543.51380: dumping result to json 46400 1727204543.51381: done dumping result, returning 46400 1727204543.51383: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000000b3f] 46400 1727204543.51385: sending task result for task 0affcd87-79f5-1303-fda8-000000000b3f ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204543.51657: no more pending results, returning what we have 46400 1727204543.51663: results queue empty 46400 1727204543.51666: checking for any_errors_fatal 46400 1727204543.51673: done checking for any_errors_fatal 46400 1727204543.51673: checking for max_fail_percentage 46400 1727204543.51675: done checking for max_fail_percentage 46400 1727204543.51676: checking to see if all hosts have failed and the running result is not ok 46400 1727204543.51677: done checking to see if all hosts have failed 46400 1727204543.51678: getting the remaining hosts for this loop 46400 1727204543.51679: done getting the remaining hosts for this loop 46400 1727204543.51683: getting the next task for host managed-node2 46400 1727204543.51690: done getting next task for host managed-node2 46400 1727204543.51694: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204543.51699: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204543.51712: getting variables 46400 1727204543.51714: in VariableManager get_vars() 46400 1727204543.51745: Calling all_inventory to load vars for managed-node2 46400 1727204543.51747: Calling groups_inventory to load vars for managed-node2 46400 1727204543.51749: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204543.51759: Calling all_plugins_play to load vars for managed-node2 46400 1727204543.51765: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204543.51768: Calling groups_plugins_play to load vars for managed-node2 46400 1727204543.52537: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b3f 46400 1727204543.52548: WORKER PROCESS EXITING 46400 1727204543.54523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204543.56935: done with get_vars() 46400 1727204543.56966: done getting variables 46400 1727204543.57034: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:23 -0400 (0:00:00.632) 0:00:33.855 ***** 46400 1727204543.57075: entering _queue_task() for managed-node2/service 46400 1727204543.57437: worker is 1 (out of 1 available) 46400 1727204543.57450: exiting _queue_task() for managed-node2/service 46400 1727204543.57467: done queuing things up, now waiting for results queue to drain 46400 1727204543.57469: waiting for pending results... 
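The task being queued here, "Enable and start wpa_supplicant" (roles/network/tasks/main.yml:133), is gated by the conditionals evaluated in the records that follow: `ansible_distribution_major_version != '6'` and `network_provider == "nm"` both hold, but `__network_wpa_supplicant_required` is False, so the task is skipped. A minimal sketch of such a when-gated task, with the service name assumed from the task title rather than taken from the role source:

```yaml
---
# Hedged sketch of a conditionally gated service task; the real role task may differ.
- hosts: managed-node2
  tasks:
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant          # assumed unit name, not shown in this log
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool
```

With the last condition False, the executor emits `skipping: [managed-node2]` with `false_condition: __network_wpa_supplicant_required`, exactly as shown further down.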
46400 1727204543.57788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204543.57943: in run() - task 0affcd87-79f5-1303-fda8-000000000b40 46400 1727204543.57955: variable 'ansible_search_path' from source: unknown 46400 1727204543.57963: variable 'ansible_search_path' from source: unknown 46400 1727204543.58000: calling self._execute() 46400 1727204543.58106: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.58113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.58126: variable 'omit' from source: magic vars 46400 1727204543.58526: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.58538: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204543.58696: variable 'network_provider' from source: set_fact 46400 1727204543.58701: Evaluated conditional (network_provider == "nm"): True 46400 1727204543.58824: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204543.58917: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204543.59112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204543.62567: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204543.62702: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204543.62755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204543.62791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204543.62837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204543.62933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204543.62971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204543.62998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.63050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204543.63067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204543.63113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204543.63136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204543.63174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.63213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204543.63228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204543.63281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204543.63304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204543.63330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.63379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204543.63394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204543.63552: variable 'network_connections' from source: include params 46400 1727204543.63569: variable 'interface' from source: play vars 46400 1727204543.63648: variable 'interface' from source: play vars 46400 1727204543.63734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204543.63920: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204543.63957: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204543.63989: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204543.64021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204543.64071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204543.64093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204543.64117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.64153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
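At this point the role is resolving `network_connections` (source: include params) and `interface` (source: play vars) to decide whether wpa_supplicant is required. A hedged sketch of how a calling play might supply those variables when including the role; the connection dictionary keys below are illustrative assumptions, not taken from this log:

```yaml
---
# Illustrative only: variable names match the log, but values and connection
# keys are placeholders (consult the role's README for the actual schema).
- hosts: managed-node2
  vars:
    interface: eth0                    # placeholder; the log shows only the variable name
  tasks:
    - name: Configure networking via the system role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:           # appears as "from source: include params" above
          - name: "{{ interface }}"
            interface_name: "{{ interface }}"
            type: ethernet
            state: up
```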
46400 1727204543.64202: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204543.64658: variable 'network_connections' from source: include params 46400 1727204543.64666: variable 'interface' from source: play vars 46400 1727204543.64725: variable 'interface' from source: play vars 46400 1727204543.64777: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204543.64780: when evaluation is False, skipping this task 46400 1727204543.64783: _execute() done 46400 1727204543.64785: dumping result to json 46400 1727204543.64787: done dumping result, returning 46400 1727204543.64796: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000000b40] 46400 1727204543.64806: sending task result for task 0affcd87-79f5-1303-fda8-000000000b40 46400 1727204543.64904: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b40 46400 1727204543.64908: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204543.64954: no more pending results, returning what we have 46400 1727204543.64958: results queue empty 46400 1727204543.64959: checking for any_errors_fatal 46400 1727204543.64991: done checking for any_errors_fatal 46400 1727204543.64992: checking for max_fail_percentage 46400 1727204543.64994: done checking for max_fail_percentage 46400 1727204543.64995: checking to see if all hosts have failed and the running result is not ok 46400 1727204543.64995: done checking to see if all hosts have failed 46400 1727204543.64996: getting the remaining hosts for this loop 46400 1727204543.64998: done getting the remaining hosts for this loop 46400 1727204543.65002: getting the next task for host managed-node2 46400 1727204543.65012: done getting next task for host managed-node2 46400 1727204543.65016: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204543.65021: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204543.65040: getting variables 46400 1727204543.65042: in VariableManager get_vars() 46400 1727204543.65082: Calling all_inventory to load vars for managed-node2 46400 1727204543.65085: Calling groups_inventory to load vars for managed-node2 46400 1727204543.65087: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204543.65098: Calling all_plugins_play to load vars for managed-node2 46400 1727204543.65101: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204543.65103: Calling groups_plugins_play to load vars for managed-node2 46400 1727204543.66934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204543.68918: done with get_vars() 46400 1727204543.68950: done getting variables 46400 1727204543.69023: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:23 -0400 (0:00:00.119) 0:00:33.975 ***** 46400 1727204543.69060: entering _queue_task() for managed-node2/service 46400 1727204543.69425: worker is 1 (out of 1 available) 46400 1727204543.69446: exiting _queue_task() for managed-node2/service 46400 1727204543.69459: done queuing things up, now waiting for results queue to drain 46400 1727204543.69461: waiting for pending results... 46400 1727204543.69883: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204543.70018: in run() - task 0affcd87-79f5-1303-fda8-000000000b41 46400 1727204543.70032: variable 'ansible_search_path' from source: unknown 46400 1727204543.70035: variable 'ansible_search_path' from source: unknown 46400 1727204543.70074: calling self._execute() 46400 1727204543.70183: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.70187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.70203: variable 'omit' from source: magic vars 46400 1727204543.70599: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.70611: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204543.70737: variable 'network_provider' from source: set_fact 46400 1727204543.70748: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204543.70751: when evaluation is False, skipping this task 46400 1727204543.70754: _execute() done 46400 1727204543.70756: dumping result to json 46400 1727204543.70758: done dumping result, returning 46400 1727204543.70767: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000000b41] 46400 1727204543.70781: sending task result for task 0affcd87-79f5-1303-fda8-000000000b41 46400 1727204543.70883: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b41 46400 1727204543.70887: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 
1727204543.70935: no more pending results, returning what we have 46400 1727204543.70940: results queue empty 46400 1727204543.70941: checking for any_errors_fatal 46400 1727204543.70950: done checking for any_errors_fatal 46400 1727204543.70950: checking for max_fail_percentage 46400 1727204543.70954: done checking for max_fail_percentage 46400 1727204543.70955: checking to see if all hosts have failed and the running result is not ok 46400 1727204543.70956: done checking to see if all hosts have failed 46400 1727204543.70957: getting the remaining hosts for this loop 46400 1727204543.70959: done getting the remaining hosts for this loop 46400 1727204543.70963: getting the next task for host managed-node2 46400 1727204543.70975: done getting next task for host managed-node2 46400 1727204543.70980: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204543.70986: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204543.71012: getting variables 46400 1727204543.71014: in VariableManager get_vars() 46400 1727204543.71055: Calling all_inventory to load vars for managed-node2 46400 1727204543.71058: Calling groups_inventory to load vars for managed-node2 46400 1727204543.71061: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204543.71077: Calling all_plugins_play to load vars for managed-node2 46400 1727204543.71080: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204543.71083: Calling groups_plugins_play to load vars for managed-node2 46400 1727204543.72984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204543.74732: done with get_vars() 46400 1727204543.74767: done getting variables 46400 1727204543.74826: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:23 -0400 (0:00:00.058) 0:00:34.033 ***** 46400 1727204543.74875: entering _queue_task() for managed-node2/copy 46400 1727204543.75224: worker is 1 (out of 1 available) 46400 1727204543.75236: exiting _queue_task() for managed-node2/copy 46400 1727204543.75249: done queuing things up, now waiting for results queue to drain 46400 1727204543.75251: waiting for pending results... 46400 1727204543.75572: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204543.75722: in run() - task 0affcd87-79f5-1303-fda8-000000000b42 46400 1727204543.75741: variable 'ansible_search_path' from source: unknown 46400 1727204543.75748: variable 'ansible_search_path' from source: unknown 46400 1727204543.75786: calling self._execute() 46400 1727204543.75887: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.75892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.75902: variable 'omit' from source: magic vars 46400 1727204543.76298: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.76310: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204543.76431: variable 'network_provider' from source: set_fact 46400 1727204543.76436: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204543.76440: when evaluation is False, skipping this task 46400 1727204543.76443: _execute() done 46400 1727204543.76445: dumping result to json 46400 1727204543.76448: done dumping result, returning 46400 1727204543.76462: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000000b42] 46400 1727204543.76467: sending task result for task 0affcd87-79f5-1303-fda8-000000000b42 46400 1727204543.76572: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b42 46400 1727204543.76576: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204543.76625: no more pending results, returning what we have 46400 1727204543.76630: results queue empty 46400 1727204543.76631: checking for any_errors_fatal 46400 1727204543.76638: done checking for any_errors_fatal 46400 1727204543.76639: checking for max_fail_percentage 46400 1727204543.76640: done checking for max_fail_percentage 46400 1727204543.76641: checking to see if all hosts have failed and the running result is not ok 46400 1727204543.76642: done checking to see if all hosts have failed 46400 1727204543.76643: getting the remaining hosts for this loop 46400 1727204543.76645: done getting the remaining hosts for this loop 46400 1727204543.76649: getting the next task for host managed-node2 46400 1727204543.76658: done getting next task for host managed-node2 46400 1727204543.76662: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204543.76669: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204543.76690: getting variables 46400 1727204543.76692: in VariableManager get_vars() 46400 1727204543.76731: Calling all_inventory to load vars for managed-node2 46400 1727204543.76734: Calling groups_inventory to load vars for managed-node2 46400 1727204543.76737: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204543.76750: Calling all_plugins_play to load vars for managed-node2 46400 1727204543.76754: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204543.76758: Calling groups_plugins_play to load vars for managed-node2 46400 1727204543.78453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204543.80190: done with get_vars() 46400 1727204543.80214: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:23 -0400 (0:00:00.054) 0:00:34.087 ***** 46400 1727204543.80311: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204543.80629: worker is 1 (out of 1 available) 46400 1727204543.80642: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204543.80656: done queuing things up, now waiting for results queue to drain 46400 1727204543.80657: waiting for pending results... 46400 1727204543.80956: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204543.81099: in run() - task 0affcd87-79f5-1303-fda8-000000000b43 46400 1727204543.81121: variable 'ansible_search_path' from source: unknown 46400 1727204543.81125: variable 'ansible_search_path' from source: unknown 46400 1727204543.81166: calling self._execute() 46400 1727204543.81258: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.81267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.81273: variable 'omit' from source: magic vars 46400 1727204543.81668: variable 'ansible_distribution_major_version' from source: facts 46400 1727204543.81679: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204543.81687: variable 'omit' from source: magic vars 46400 1727204543.81757: variable 'omit' from source: magic vars 46400 1727204543.81929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204543.84650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204543.84720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204543.84753: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204543.84787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204543.84822: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204543.84908: variable 'network_provider' from source: set_fact 46400 1727204543.85053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 46400 1727204543.85084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204543.85109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204543.85166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204543.85178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204543.85258: variable 'omit' from source: magic vars 46400 1727204543.85383: variable 'omit' from source: magic vars 46400 1727204543.85491: variable 'network_connections' from source: include params 46400 1727204543.85502: variable 'interface' from source: play vars 46400 1727204543.85557: variable 'interface' from source: play vars 46400 1727204543.85824: variable 'omit' from source: magic vars 46400 1727204543.85831: variable '__lsr_ansible_managed' from source: task vars 46400 1727204543.85888: variable '__lsr_ansible_managed' from source: task vars 46400 1727204543.86326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204543.86730: Loaded config def from plugin (lookup/template) 46400 1727204543.86733: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204543.86765: File lookup term: get_ansible_managed.j2 46400 1727204543.86770: variable 'ansible_search_path' from source: unknown 46400 1727204543.86783: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204543.86797: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204543.86813: variable 'ansible_search_path' from source: unknown 46400 1727204543.96025: variable 'ansible_managed' from source: unknown 46400 1727204543.96117: variable 'omit' from source: magic vars 46400 1727204543.96137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204543.96157: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204543.96176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204543.96190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204543.96198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204543.96219: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204543.96222: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.96225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.96289: Set connection var ansible_shell_type to sh 46400 1727204543.96297: Set connection var ansible_shell_executable to /bin/sh 46400 1727204543.96302: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204543.96307: Set connection var ansible_connection to ssh 46400 1727204543.96312: Set connection var ansible_pipelining to False 46400 1727204543.96317: Set connection var ansible_timeout to 10 46400 1727204543.96336: variable 'ansible_shell_executable' from source: unknown 46400 1727204543.96339: variable 'ansible_connection' from source: unknown 46400 1727204543.96341: variable 'ansible_module_compression' from source: unknown 46400 1727204543.96343: variable 'ansible_shell_type' from source: unknown 46400 1727204543.96345: variable 'ansible_shell_executable' from source: unknown 46400 1727204543.96349: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204543.96351: variable 'ansible_pipelining' from source: unknown 46400 1727204543.96354: variable 'ansible_timeout' from source: unknown 46400 1727204543.96358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204543.96452: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204543.96469: variable 'omit' from source: magic vars 46400 1727204543.96472: starting attempt loop 46400 1727204543.96474: running the handler 46400 1727204543.96482: _low_level_execute_command(): starting 46400 1727204543.96488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204543.96988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.97004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.97017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.97029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.97047: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.97116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.97135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.97138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204543.97193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204543.98869: stdout chunk (state=3): >>>/root <<< 46400 1727204543.99043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204543.99046: stdout chunk (state=3): >>><<< 46400 1727204543.99049: stderr chunk (state=3): >>><<< 46400 1727204543.99173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204543.99178: _low_level_execute_command(): starting 46400 1727204543.99181: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090 `" && echo ansible-tmp-1727204543.9907365-49178-106836084027090="` echo /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090 `" ) && sleep 0' 46400 1727204543.99724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204543.99736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.99746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.99758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.99792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.99800: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204543.99809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.99821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204543.99833: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204543.99844: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204543.99851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204543.99862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204543.99878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204543.99886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204543.99893: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204543.99901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204543.99958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204543.99983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204543.99986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.00032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.01895: stdout chunk (state=3): >>>ansible-tmp-1727204543.9907365-49178-106836084027090=/root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090 <<< 46400 1727204544.02019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.02094: stderr chunk (state=3): >>><<< 46400 1727204544.02113: stdout chunk (state=3): >>><<< 46400 1727204544.02192: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204543.9907365-49178-106836084027090=/root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.02196: variable 'ansible_module_compression' from source: unknown 46400 1727204544.02241: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204544.02268: variable 'ansible_facts' from source: unknown 46400 1727204544.02343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/AnsiballZ_network_connections.py 46400 1727204544.02442: Sending initial data 46400 
1727204544.02447: Sent initial data (168 bytes) 46400 1727204544.03101: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.03107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.03138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.03142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.03170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.03174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.03221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.03225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.03280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.04988: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 46400 1727204544.05002: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204544.05011: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 46400 1727204544.05014: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 46400 1727204544.05032: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 46400 1727204544.05035: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 46400 1727204544.05048: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 46400 1727204544.05067: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 46400 1727204544.05069: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204544.05135: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 46400 1727204544.05153: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204544.05197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmppkgjspr5 /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/AnsiballZ_network_connections.py <<< 46400 1727204544.05247: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204544.06901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.07071: stderr chunk (state=3): >>><<< 46400 1727204544.07080: stdout chunk (state=3): >>><<< 46400 1727204544.07113: done transferring module to remote 46400 
1727204544.07124: _low_level_execute_command(): starting 46400 1727204544.07139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/ /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/AnsiballZ_network_connections.py && sleep 0' 46400 1727204544.07884: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.07900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.07905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.07914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.07955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.07958: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.07981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.08002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.08021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.08024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.08031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.08050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.08058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.08071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.08078: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204544.08087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.08161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.08176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.08184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.08241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.09932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.09977: stderr chunk (state=3): >>><<< 46400 1727204544.09980: stdout chunk (state=3): >>><<< 46400 1727204544.09993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.09996: _low_level_execute_command(): starting 46400 1727204544.10001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/AnsiballZ_network_connections.py && sleep 0' 46400 1727204544.10427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.10434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.10481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.10485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.10487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.10533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.10537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.10549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.10603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.35414: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204544.38476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204544.38543: stderr chunk (state=3): >>><<< 46400 1727204544.38547: stdout chunk (state=3): >>><<< 46400 1727204544.38573: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
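The module invocation echoed above shows the provider resolved to 'nm' and a single connection profile named 'statebr' (type bridge, with dhcp4 and auto6 disabled). A sketch of role input that would produce these module_args, assuming the connection name comes from the 'interface' play var seen earlier in this run (reconstructed from the logged args, not copied from the test playbook):

    vars:
      interface: statebr
    tasks:
      - name: Configure networking connection profiles
        ansible.builtin.include_role:
          name: fedora.linux_system_roles.network
        vars:
          network_connections:
            - name: "{{ interface }}"
              persistent_state: present
              type: bridge
              ip:
                dhcp4: false
                auto6: false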
46400 1727204544.38615: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204544.38625: _low_level_execute_command(): starting 46400 1727204544.38630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204543.9907365-49178-106836084027090/ > /dev/null 2>&1 && sleep 0' 46400 1727204544.39277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.39287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.39298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.39313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.39355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.39367: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.39376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.39390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.39398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.39405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.39412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.39421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.39433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.39440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.39446: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204544.39456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.39529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.39544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.39553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.39736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.41484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 46400 1727204544.41492: stdout chunk (state=3): >>><<< 46400 1727204544.41494: stderr chunk (state=3): >>><<< 46400 1727204544.41511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.41517: handler run complete 46400 1727204544.41552: attempt loop complete, returning result 46400 1727204544.41556: _execute() done 46400 1727204544.41559: dumping result to json 46400 1727204544.41566: done dumping result, returning 46400 1727204544.41576: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000000b43] 46400 1727204544.41581: sending task result for task 0affcd87-79f5-1303-fda8-000000000b43 46400 1727204544.41700: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b43 46400 1727204544.41702: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 46400 1727204544.41803: no more pending results, returning what we have 46400 1727204544.41806: results queue empty 46400 1727204544.41807: checking for any_errors_fatal 46400 1727204544.41814: done checking for any_errors_fatal 46400 1727204544.41814: checking for max_fail_percentage 46400 1727204544.41816: done checking for max_fail_percentage 46400 1727204544.41817: checking to see if all hosts have failed and the running result is not ok 46400 1727204544.41818: done checking to see if all hosts have failed 46400 1727204544.41818: getting the remaining hosts for this loop 46400 1727204544.41820: done getting the remaining hosts for this loop 46400 1727204544.41823: getting the next task for host managed-node2 46400 1727204544.41830: done getting next task for host managed-node2 46400 1727204544.41834: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204544.41838: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204544.41850: getting variables 46400 1727204544.41851: in VariableManager get_vars() 46400 1727204544.41891: Calling all_inventory to load vars for managed-node2 46400 1727204544.41893: Calling groups_inventory to load vars for managed-node2 46400 1727204544.41895: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204544.41905: Calling all_plugins_play to load vars for managed-node2 46400 1727204544.41907: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204544.41910: Calling groups_plugins_play to load vars for managed-node2 46400 1727204544.43520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204544.45480: done with get_vars() 46400 1727204544.45513: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:24 -0400 (0:00:00.652) 0:00:34.740 ***** 46400 1727204544.45614: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204544.45983: worker is 1 (out of 1 available) 46400 1727204544.45995: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204544.46010: done queuing things up, now waiting for results queue to drain 46400 1727204544.46012: waiting for pending results... 
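The 'Configure networking state' task queued above is skipped in the following entries because network_state still holds its role default of {}. The gating looks roughly like the sketch below; only the 'when' condition is confirmed by the skip result in this log, and the module parameter name is an assumption since the module is never actually invoked here:

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        desired_state: "{{ network_state }}"   # assumed parameter name
      when: network_state != {}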
46400 1727204544.46326: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204544.46497: in run() - task 0affcd87-79f5-1303-fda8-000000000b44 46400 1727204544.46521: variable 'ansible_search_path' from source: unknown 46400 1727204544.46529: variable 'ansible_search_path' from source: unknown 46400 1727204544.46581: calling self._execute() 46400 1727204544.46693: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.46709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.46725: variable 'omit' from source: magic vars 46400 1727204544.48066: variable 'ansible_distribution_major_version' from source: facts 46400 1727204544.48085: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204544.48332: variable 'network_state' from source: role '' defaults 46400 1727204544.48375: Evaluated conditional (network_state != {}): False 46400 1727204544.48473: when evaluation is False, skipping this task 46400 1727204544.48481: _execute() done 46400 1727204544.48488: dumping result to json 46400 1727204544.48496: done dumping result, returning 46400 1727204544.48508: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000000b44] 46400 1727204544.48519: sending task result for task 0affcd87-79f5-1303-fda8-000000000b44 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204544.48682: no more pending results, returning what we have 46400 1727204544.48687: results queue empty 46400 1727204544.48688: checking for any_errors_fatal 46400 1727204544.48704: done checking for any_errors_fatal 46400 1727204544.48705: checking for max_fail_percentage 46400 1727204544.48707: done checking for max_fail_percentage 46400 1727204544.48708: checking to see if all hosts have failed and the running result is not ok 46400 1727204544.48709: done checking to see if all hosts have failed 46400 1727204544.48710: getting the remaining hosts for this loop 46400 1727204544.48712: done getting the remaining hosts for this loop 46400 1727204544.48716: getting the next task for host managed-node2 46400 1727204544.48726: done getting next task for host managed-node2 46400 1727204544.48731: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204544.48737: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204544.48764: getting variables 46400 1727204544.48767: in VariableManager get_vars() 46400 1727204544.48807: Calling all_inventory to load vars for managed-node2 46400 1727204544.48810: Calling groups_inventory to load vars for managed-node2 46400 1727204544.48813: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204544.48826: Calling all_plugins_play to load vars for managed-node2 46400 1727204544.48829: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204544.48833: Calling groups_plugins_play to load vars for managed-node2 46400 1727204544.49815: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b44 46400 1727204544.49819: WORKER PROCESS EXITING 46400 1727204544.51335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204544.53539: done with get_vars() 46400 1727204544.53571: done getting variables 46400 1727204544.53636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:24 -0400 (0:00:00.080) 0:00:34.821 ***** 46400 1727204544.53680: entering _queue_task() for managed-node2/debug 46400 1727204544.54035: worker is 1 (out of 1 available) 46400 1727204544.54049: exiting _queue_task() for managed-node2/debug 46400 1727204544.54070: done queuing things up, now waiting for results queue to drain 46400 1727204544.54072: waiting for pending results... 
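Next the role reports the stderr captured from the network_connections run. Based on the output that follows (the key printed is __network_connections_result.stderr_lines), the debug task is presumably of the form below; the exact wording of the real task may differ:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines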
46400 1727204544.54328: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204544.54426: in run() - task 0affcd87-79f5-1303-fda8-000000000b45 46400 1727204544.54437: variable 'ansible_search_path' from source: unknown 46400 1727204544.54441: variable 'ansible_search_path' from source: unknown 46400 1727204544.54473: calling self._execute() 46400 1727204544.54548: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.54553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.54565: variable 'omit' from source: magic vars 46400 1727204544.54834: variable 'ansible_distribution_major_version' from source: facts 46400 1727204544.54845: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204544.54850: variable 'omit' from source: magic vars 46400 1727204544.54899: variable 'omit' from source: magic vars 46400 1727204544.54923: variable 'omit' from source: magic vars 46400 1727204544.54958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204544.54987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204544.55005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204544.55017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.55026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.55051: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204544.55054: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.55058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.55176: Set connection var ansible_shell_type to sh 46400 1727204544.55191: Set connection var ansible_shell_executable to /bin/sh 46400 1727204544.55202: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204544.55212: Set connection var ansible_connection to ssh 46400 1727204544.55227: Set connection var ansible_pipelining to False 46400 1727204544.55236: Set connection var ansible_timeout to 10 46400 1727204544.55271: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.55282: variable 'ansible_connection' from source: unknown 46400 1727204544.55289: variable 'ansible_module_compression' from source: unknown 46400 1727204544.55295: variable 'ansible_shell_type' from source: unknown 46400 1727204544.55300: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.55306: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.55313: variable 'ansible_pipelining' from source: unknown 46400 1727204544.55319: variable 'ansible_timeout' from source: unknown 46400 1727204544.55330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.55480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204544.55503: variable 'omit' from source: magic vars 46400 1727204544.55512: starting attempt loop 46400 1727204544.55517: running the handler 46400 1727204544.55665: variable '__network_connections_result' from source: set_fact 46400 1727204544.55729: handler run complete 46400 1727204544.55750: attempt loop complete, returning result 46400 1727204544.55758: _execute() done 46400 1727204544.55772: dumping result to json 46400 1727204544.55780: done dumping result, returning 46400 1727204544.55790: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000000b45] 46400 1727204544.55799: sending task result for task 0affcd87-79f5-1303-fda8-000000000b45 46400 1727204544.55913: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b45 46400 1727204544.55924: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39" ] } 46400 1727204544.56012: no more pending results, returning what we have 46400 1727204544.56018: results queue empty 46400 1727204544.56019: checking for any_errors_fatal 46400 1727204544.56069: done checking for any_errors_fatal 46400 1727204544.56071: checking for max_fail_percentage 46400 1727204544.56073: done checking for max_fail_percentage 46400 1727204544.56074: checking to see if all hosts have failed and the running result is not ok 46400 1727204544.56075: done checking to see if all hosts have failed 46400 1727204544.56076: getting the remaining hosts for this loop 46400 1727204544.56077: done getting the remaining hosts for this loop 46400 1727204544.56081: getting the next task for host managed-node2 46400 1727204544.56091: done getting next task for host managed-node2 46400 1727204544.56095: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204544.56100: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204544.56113: getting variables 46400 1727204544.56114: in VariableManager get_vars() 46400 1727204544.56151: Calling all_inventory to load vars for managed-node2 46400 1727204544.56153: Calling groups_inventory to load vars for managed-node2 46400 1727204544.56156: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204544.56178: Calling all_plugins_play to load vars for managed-node2 46400 1727204544.56182: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204544.56186: Calling groups_plugins_play to load vars for managed-node2 46400 1727204544.58025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204544.60686: done with get_vars() 46400 1727204544.60716: done getting variables 46400 1727204544.60789: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:24 -0400 (0:00:00.071) 0:00:34.892 ***** 46400 1727204544.60831: entering _queue_task() for managed-node2/debug 46400 1727204544.61189: worker is 1 (out of 1 available) 46400 1727204544.61202: exiting _queue_task() for managed-node2/debug 46400 1727204544.61215: done queuing things up, now waiting for results queue to drain 46400 1727204544.61216: waiting for pending results... 46400 1727204544.61597: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204544.61774: in run() - task 0affcd87-79f5-1303-fda8-000000000b46 46400 1727204544.61793: variable 'ansible_search_path' from source: unknown 46400 1727204544.61800: variable 'ansible_search_path' from source: unknown 46400 1727204544.61856: calling self._execute() 46400 1727204544.61977: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.61995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.62014: variable 'omit' from source: magic vars 46400 1727204544.62439: variable 'ansible_distribution_major_version' from source: facts 46400 1727204544.62455: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204544.62470: variable 'omit' from source: magic vars 46400 1727204544.62569: variable 'omit' from source: magic vars 46400 1727204544.62621: variable 'omit' from source: magic vars 46400 1727204544.62677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204544.62729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204544.62767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204544.62790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.62804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.62852: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204544.62869: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.62876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.62993: Set connection var ansible_shell_type to sh 46400 1727204544.63007: Set connection var ansible_shell_executable to /bin/sh 46400 1727204544.63016: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204544.63024: Set connection var ansible_connection to ssh 46400 1727204544.63043: Set connection var ansible_pipelining to False 46400 1727204544.63058: Set connection var ansible_timeout to 10 46400 1727204544.63097: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.63104: variable 'ansible_connection' from source: unknown 46400 1727204544.63110: variable 'ansible_module_compression' from source: unknown 46400 1727204544.63115: variable 'ansible_shell_type' from source: unknown 46400 1727204544.63121: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.63126: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.63136: variable 'ansible_pipelining' from source: unknown 46400 1727204544.63147: variable 'ansible_timeout' from source: unknown 46400 1727204544.63159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.63327: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204544.63343: variable 'omit' from source: magic vars 46400 1727204544.63352: starting attempt loop 46400 1727204544.63358: running the handler 46400 1727204544.63422: variable '__network_connections_result' from source: set_fact 46400 1727204544.63517: variable '__network_connections_result' from source: set_fact 46400 1727204544.63655: handler run complete 46400 1727204544.63700: attempt loop complete, returning result 46400 1727204544.63713: _execute() done 46400 1727204544.63720: dumping result to json 46400 1727204544.63731: done dumping result, returning 46400 1727204544.63742: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000000b46] 46400 1727204544.63756: sending task result for task 0affcd87-79f5-1303-fda8-000000000b46 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39" ] } } 46400 1727204544.63987: no more pending results, returning what we have 46400 1727204544.63992: results queue empty 46400 1727204544.63993: checking for any_errors_fatal 46400 1727204544.64005: done checking for any_errors_fatal 46400 
1727204544.64006: checking for max_fail_percentage 46400 1727204544.64009: done checking for max_fail_percentage 46400 1727204544.64010: checking to see if all hosts have failed and the running result is not ok 46400 1727204544.64011: done checking to see if all hosts have failed 46400 1727204544.64011: getting the remaining hosts for this loop 46400 1727204544.64013: done getting the remaining hosts for this loop 46400 1727204544.64018: getting the next task for host managed-node2 46400 1727204544.64027: done getting next task for host managed-node2 46400 1727204544.64031: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204544.64036: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204544.64050: getting variables 46400 1727204544.64051: in VariableManager get_vars() 46400 1727204544.64091: Calling all_inventory to load vars for managed-node2 46400 1727204544.64094: Calling groups_inventory to load vars for managed-node2 46400 1727204544.64106: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204544.64117: Calling all_plugins_play to load vars for managed-node2 46400 1727204544.64120: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204544.64123: Calling groups_plugins_play to load vars for managed-node2 46400 1727204544.65177: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b46 46400 1727204544.65180: WORKER PROCESS EXITING 46400 1727204544.67223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204544.69228: done with get_vars() 46400 1727204544.69257: done getting variables 46400 1727204544.69319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:24 -0400 (0:00:00.085) 0:00:34.978 ***** 46400 1727204544.69360: entering _queue_task() for managed-node2/debug 46400 1727204544.69725: worker is 1 (out of 1 available) 46400 1727204544.69740: exiting _queue_task() for managed-node2/debug 46400 1727204544.69753: done queuing things up, now waiting for results queue to drain 46400 1727204544.69755: waiting for pending results... 
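The __network_connections_result payload printed above records the module_args the role handed to the network_connections module: a single persistent bridge profile named statebr with DHCPv4 and IPv6 auto-configuration disabled, provider nm. An invocation that would yield exactly that connections list might look like the sketch below; only the network_connections entries mirror the logged module_args, while the wrapper task and its name are hypothetical. The "Show debug messages for the network_state" task queued above is about to be skipped, because network_state still holds the role default of {} (the log records false_condition: "network_state != {}").

    # Hypothetical wrapper task; the network_connections list mirrors the logged module_args.
    # The provider resolved to "nm" in the log; how it was selected is not visible here.
    - name: Create the statebr bridge profile
      include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false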
46400 1727204544.70058: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204544.70215: in run() - task 0affcd87-79f5-1303-fda8-000000000b47 46400 1727204544.70236: variable 'ansible_search_path' from source: unknown 46400 1727204544.70277: variable 'ansible_search_path' from source: unknown 46400 1727204544.70374: calling self._execute() 46400 1727204544.70477: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.70497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.70517: variable 'omit' from source: magic vars 46400 1727204544.70944: variable 'ansible_distribution_major_version' from source: facts 46400 1727204544.70962: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204544.71102: variable 'network_state' from source: role '' defaults 46400 1727204544.71119: Evaluated conditional (network_state != {}): False 46400 1727204544.71126: when evaluation is False, skipping this task 46400 1727204544.71133: _execute() done 46400 1727204544.71140: dumping result to json 46400 1727204544.71147: done dumping result, returning 46400 1727204544.71157: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000000b47] 46400 1727204544.71170: sending task result for task 0affcd87-79f5-1303-fda8-000000000b47 46400 1727204544.71293: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b47 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204544.71344: no more pending results, returning what we have 46400 1727204544.71349: results queue empty 46400 1727204544.71350: checking for any_errors_fatal 46400 1727204544.71362: done checking for any_errors_fatal 46400 1727204544.71363: checking for max_fail_percentage 46400 1727204544.71366: done checking for max_fail_percentage 46400 1727204544.71367: checking to see if all hosts have failed and the running result is not ok 46400 1727204544.71368: done checking to see if all hosts have failed 46400 1727204544.71369: getting the remaining hosts for this loop 46400 1727204544.71371: done getting the remaining hosts for this loop 46400 1727204544.71375: getting the next task for host managed-node2 46400 1727204544.71386: done getting next task for host managed-node2 46400 1727204544.71391: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204544.71398: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204544.71422: getting variables 46400 1727204544.71424: in VariableManager get_vars() 46400 1727204544.71467: Calling all_inventory to load vars for managed-node2 46400 1727204544.71470: Calling groups_inventory to load vars for managed-node2 46400 1727204544.71473: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204544.71486: Calling all_plugins_play to load vars for managed-node2 46400 1727204544.71490: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204544.71493: Calling groups_plugins_play to load vars for managed-node2 46400 1727204544.72518: WORKER PROCESS EXITING 46400 1727204544.73275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204544.74988: done with get_vars() 46400 1727204544.75012: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:24 -0400 (0:00:00.057) 0:00:35.035 ***** 46400 1727204544.75118: entering _queue_task() for managed-node2/ping 46400 1727204544.75447: worker is 1 (out of 1 available) 46400 1727204544.75466: exiting _queue_task() for managed-node2/ping 46400 1727204544.75479: done queuing things up, now waiting for results queue to drain 46400 1727204544.75481: waiting for pending results... 46400 1727204544.75778: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204544.75941: in run() - task 0affcd87-79f5-1303-fda8-000000000b48 46400 1727204544.75962: variable 'ansible_search_path' from source: unknown 46400 1727204544.75973: variable 'ansible_search_path' from source: unknown 46400 1727204544.76018: calling self._execute() 46400 1727204544.76125: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.76142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.76157: variable 'omit' from source: magic vars 46400 1727204544.76550: variable 'ansible_distribution_major_version' from source: facts 46400 1727204544.76571: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204544.76582: variable 'omit' from source: magic vars 46400 1727204544.76648: variable 'omit' from source: magic vars 46400 1727204544.76696: variable 'omit' from source: magic vars 46400 1727204544.76743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204544.76792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204544.76816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204544.76834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.76849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204544.76883: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204544.76892: variable 'ansible_host' from source: 
host vars for 'managed-node2' 46400 1727204544.76900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.76991: Set connection var ansible_shell_type to sh 46400 1727204544.77010: Set connection var ansible_shell_executable to /bin/sh 46400 1727204544.77018: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204544.77025: Set connection var ansible_connection to ssh 46400 1727204544.77033: Set connection var ansible_pipelining to False 46400 1727204544.77040: Set connection var ansible_timeout to 10 46400 1727204544.77070: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.77078: variable 'ansible_connection' from source: unknown 46400 1727204544.77089: variable 'ansible_module_compression' from source: unknown 46400 1727204544.77096: variable 'ansible_shell_type' from source: unknown 46400 1727204544.77101: variable 'ansible_shell_executable' from source: unknown 46400 1727204544.77109: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204544.77118: variable 'ansible_pipelining' from source: unknown 46400 1727204544.77124: variable 'ansible_timeout' from source: unknown 46400 1727204544.77131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204544.77342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204544.77359: variable 'omit' from source: magic vars 46400 1727204544.77370: starting attempt loop 46400 1727204544.77376: running the handler 46400 1727204544.77393: _low_level_execute_command(): starting 46400 1727204544.77403: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204544.78209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.78224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.78239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.78256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.78306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.78321: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.78334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.78352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.78362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.78376: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.78388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.78405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.78420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.78435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.78447: stderr chunk (state=3): >>>debug2: match found <<< 
46400 1727204544.78461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.78546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.78570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.78586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.78666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.80333: stdout chunk (state=3): >>>/root <<< 46400 1727204544.80516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.80520: stdout chunk (state=3): >>><<< 46400 1727204544.80523: stderr chunk (state=3): >>><<< 46400 1727204544.80650: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.80654: _low_level_execute_command(): starting 46400 1727204544.80657: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382 `" && echo ansible-tmp-1727204544.8054695-49225-40208092122382="` echo /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382 `" ) && sleep 0' 46400 1727204544.81287: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.81290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.81295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.81327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204544.81332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.81335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.81344: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.81418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.81425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.81427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.81472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.83319: stdout chunk (state=3): >>>ansible-tmp-1727204544.8054695-49225-40208092122382=/root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382 <<< 46400 1727204544.83433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.83509: stderr chunk (state=3): >>><<< 46400 1727204544.83512: stdout chunk (state=3): >>><<< 46400 1727204544.83670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204544.8054695-49225-40208092122382=/root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.83673: variable 'ansible_module_compression' from source: unknown 46400 1727204544.83676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204544.83678: variable 'ansible_facts' from source: unknown 46400 1727204544.83733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/AnsiballZ_ping.py 46400 1727204544.83885: Sending initial data 46400 1727204544.83888: Sent initial data (152 bytes) 46400 1727204544.84823: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.84837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.84853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.84873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.84916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.84929: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.84943: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.84959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.84974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.84984: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.84995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.85007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.85021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.85034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.85044: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204544.85056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.85134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.85156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.85174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.85241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.87015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204544.87057: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204544.87104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp0gxs8td9 /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/AnsiballZ_ping.py <<< 46400 1727204544.87414: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204544.88216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.88406: stderr chunk (state=3): >>><<< 46400 1727204544.88409: stdout chunk (state=3): >>><<< 46400 1727204544.88411: done transferring module to remote 46400 1727204544.88413: _low_level_execute_command(): starting 46400 1727204544.88415: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/ /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/AnsiballZ_ping.py && sleep 0' 46400 1727204544.89012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.89023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.89040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
46400 1727204544.89059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.89103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.89115: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.89127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.89145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.89157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.89179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.89194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.89208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.89224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.89238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.89251: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204544.89267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.89348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.89373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.89398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.89468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204544.91188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204544.91293: stderr chunk (state=3): >>><<< 46400 1727204544.91297: stdout chunk (state=3): >>><<< 46400 1727204544.91393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204544.91397: _low_level_execute_command(): starting 46400 1727204544.91400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/AnsiballZ_ping.py && sleep 
0' 46400 1727204544.91999: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204544.92013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.92026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.92044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.92094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.92105: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204544.92118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.92134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204544.92144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204544.92154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204544.92167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204544.92185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204544.92199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204544.92210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204544.92220: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204544.92231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204544.92314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204544.92334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204544.92349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204544.92427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.05566: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204545.06571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204545.06575: stdout chunk (state=3): >>><<< 46400 1727204545.06577: stderr chunk (state=3): >>><<< 46400 1727204545.06699: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204545.06704: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204545.06710: _low_level_execute_command(): starting 46400 1727204545.06712: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204544.8054695-49225-40208092122382/ > /dev/null 2>&1 && sleep 0' 46400 1727204545.07342: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204545.07356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204545.07378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.07396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.07436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204545.07448: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204545.07460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.07481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204545.07499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204545.07510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204545.07522: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204545.07535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.07550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.07566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204545.07580: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204545.07600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.07678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204545.07699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204545.07726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.07797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.09575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204545.09642: stderr chunk (state=3): >>><<< 46400 1727204545.09666: stdout chunk (state=3): >>><<< 46400 1727204545.09689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204545.09695: handler run complete 46400 1727204545.09711: attempt loop complete, returning result 46400 1727204545.09714: _execute() done 46400 1727204545.09717: dumping result to json 46400 1727204545.09719: done dumping result, returning 46400 1727204545.09730: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000000b48] 46400 1727204545.09734: sending task result for task 0affcd87-79f5-1303-fda8-000000000b48 46400 1727204545.09838: done sending task result for task 0affcd87-79f5-1303-fda8-000000000b48 46400 1727204545.09841: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204545.09908: no more pending results, returning what we have 46400 1727204545.09912: results queue empty 46400 1727204545.09913: checking for any_errors_fatal 46400 1727204545.09920: done checking for any_errors_fatal 46400 1727204545.09921: checking for max_fail_percentage 46400 1727204545.09922: done checking for max_fail_percentage 46400 
1727204545.09923: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.09924: done checking to see if all hosts have failed 46400 1727204545.09925: getting the remaining hosts for this loop 46400 1727204545.09927: done getting the remaining hosts for this loop 46400 1727204545.09930: getting the next task for host managed-node2 46400 1727204545.09942: done getting next task for host managed-node2 46400 1727204545.09944: ^ task is: TASK: meta (role_complete) 46400 1727204545.09950: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.09963: getting variables 46400 1727204545.09966: in VariableManager get_vars() 46400 1727204545.10003: Calling all_inventory to load vars for managed-node2 46400 1727204545.10005: Calling groups_inventory to load vars for managed-node2 46400 1727204545.10007: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.10016: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.10019: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.10021: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.15390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.17017: done with get_vars() 46400 1727204545.17042: done getting variables 46400 1727204545.17115: done queuing things up, now waiting for results queue to drain 46400 1727204545.17117: results queue empty 46400 1727204545.17118: checking for any_errors_fatal 46400 1727204545.17121: done checking for any_errors_fatal 46400 1727204545.17122: checking for max_fail_percentage 46400 1727204545.17123: done checking for max_fail_percentage 46400 1727204545.17124: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.17125: done checking to see if all hosts have failed 46400 1727204545.17126: getting the remaining hosts for this loop 46400 1727204545.17127: done getting the remaining hosts for this loop 46400 1727204545.17134: getting the next task for host managed-node2 46400 1727204545.17139: done getting next task for host managed-node2 46400 1727204545.17141: ^ task is: TASK: Show result 46400 1727204545.17144: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.17146: getting variables 46400 1727204545.17147: in VariableManager get_vars() 46400 1727204545.17157: Calling all_inventory to load vars for managed-node2 46400 1727204545.17159: Calling groups_inventory to load vars for managed-node2 46400 1727204545.17162: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.17168: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.17171: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.17173: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.18336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.19955: done with get_vars() 46400 1727204545.19977: done getting variables 46400 1727204545.20021: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.449) 0:00:35.485 ***** 46400 1727204545.20050: entering _queue_task() for managed-node2/debug 46400 1727204545.20392: worker is 1 (out of 1 available) 46400 1727204545.20405: exiting _queue_task() for managed-node2/debug 46400 1727204545.20419: done queuing things up, now waiting for results queue to drain 46400 1727204545.20422: waiting for pending results... 
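The "Re-test connectivity" step that completed above (roles/network/tasks/main.yml:192) is a plain ping module call: the log queues managed-node2/ping, loads the generic 'normal' action plugin, and gets back ok: {"changed": false, "ping": "pong"}. The surrounding _low_level_execute_command() and sftp chunks show the usual AnsiballZ round trip over the existing SSH ControlMaster session: make a temp directory under ~/.ansible/tmp, put AnsiballZ_ping.py, chmod it, run it with the remote /usr/bin/python3.9, then rm -rf the temp directory. The role task itself is probably no more than the sketch below (any when: guard is not visible in this log).

    # Minimal sketch consistent with the logged behavior.
    - name: Re-test connectivity
      ping:

After meta (role_complete), the test playbook's own "Show result" task (create_bridge_profile.yml:14) re-prints the same __network_connections_result, as the entries that follow show.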
46400 1727204545.20709: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204545.20854: in run() - task 0affcd87-79f5-1303-fda8-000000000ad2 46400 1727204545.20879: variable 'ansible_search_path' from source: unknown 46400 1727204545.20886: variable 'ansible_search_path' from source: unknown 46400 1727204545.20923: calling self._execute() 46400 1727204545.21020: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.21033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.21048: variable 'omit' from source: magic vars 46400 1727204545.21536: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.21554: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.21568: variable 'omit' from source: magic vars 46400 1727204545.21621: variable 'omit' from source: magic vars 46400 1727204545.21659: variable 'omit' from source: magic vars 46400 1727204545.21707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204545.21750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204545.21780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204545.21802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204545.21817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204545.21855: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204545.21865: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.21874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.21974: Set connection var ansible_shell_type to sh 46400 1727204545.21990: Set connection var ansible_shell_executable to /bin/sh 46400 1727204545.22001: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204545.22009: Set connection var ansible_connection to ssh 46400 1727204545.22017: Set connection var ansible_pipelining to False 46400 1727204545.22026: Set connection var ansible_timeout to 10 46400 1727204545.22055: variable 'ansible_shell_executable' from source: unknown 46400 1727204545.22069: variable 'ansible_connection' from source: unknown 46400 1727204545.22077: variable 'ansible_module_compression' from source: unknown 46400 1727204545.22083: variable 'ansible_shell_type' from source: unknown 46400 1727204545.22089: variable 'ansible_shell_executable' from source: unknown 46400 1727204545.22095: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.22102: variable 'ansible_pipelining' from source: unknown 46400 1727204545.22108: variable 'ansible_timeout' from source: unknown 46400 1727204545.22115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.22258: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204545.22281: variable 'omit' from source: magic vars 46400 1727204545.22291: 
starting attempt loop 46400 1727204545.22298: running the handler 46400 1727204545.22346: variable '__network_connections_result' from source: set_fact 46400 1727204545.22432: variable '__network_connections_result' from source: set_fact 46400 1727204545.22554: handler run complete 46400 1727204545.22592: attempt loop complete, returning result 46400 1727204545.22602: _execute() done 46400 1727204545.22612: dumping result to json 46400 1727204545.22621: done dumping result, returning 46400 1727204545.22727: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-000000000ad2] 46400 1727204545.22738: sending task result for task 0affcd87-79f5-1303-fda8-000000000ad2 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39" ] } } 46400 1727204545.22954: no more pending results, returning what we have 46400 1727204545.22958: results queue empty 46400 1727204545.22959: checking for any_errors_fatal 46400 1727204545.22962: done checking for any_errors_fatal 46400 1727204545.22963: checking for max_fail_percentage 46400 1727204545.22967: done checking for max_fail_percentage 46400 1727204545.22968: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.22969: done checking to see if all hosts have failed 46400 1727204545.22970: getting the remaining hosts for this loop 46400 1727204545.22972: done getting the remaining hosts for this loop 46400 1727204545.22976: getting the next task for host managed-node2 46400 1727204545.22987: done getting next task for host managed-node2 46400 1727204545.22990: ^ task is: TASK: Test 46400 1727204545.22993: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204545.22998: getting variables 46400 1727204545.22999: in VariableManager get_vars() 46400 1727204545.23035: Calling all_inventory to load vars for managed-node2 46400 1727204545.23037: Calling groups_inventory to load vars for managed-node2 46400 1727204545.23041: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.23053: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.23055: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.23058: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.24403: done sending task result for task 0affcd87-79f5-1303-fda8-000000000ad2 46400 1727204545.24407: WORKER PROCESS EXITING 46400 1727204545.25635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.28932: done with get_vars() 46400 1727204545.28962: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.090) 0:00:35.575 ***** 46400 1727204545.29072: entering _queue_task() for managed-node2/include_tasks 46400 1727204545.29388: worker is 1 (out of 1 available) 46400 1727204545.29400: exiting _queue_task() for managed-node2/include_tasks 46400 1727204545.29413: done queuing things up, now waiting for results queue to drain 46400 1727204545.29415: waiting for pending results... 46400 1727204545.29822: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204545.29956: in run() - task 0affcd87-79f5-1303-fda8-000000000a4d 46400 1727204545.29983: variable 'ansible_search_path' from source: unknown 46400 1727204545.29991: variable 'ansible_search_path' from source: unknown 46400 1727204545.30040: variable 'lsr_test' from source: include params 46400 1727204545.30259: variable 'lsr_test' from source: include params 46400 1727204545.30331: variable 'omit' from source: magic vars 46400 1727204545.30485: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.30507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.30528: variable 'omit' from source: magic vars 46400 1727204545.30775: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.30790: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.30800: variable 'item' from source: unknown 46400 1727204545.30873: variable 'item' from source: unknown 46400 1727204545.30915: variable 'item' from source: unknown 46400 1727204545.30978: variable 'item' from source: unknown 46400 1727204545.31125: dumping result to json 46400 1727204545.31134: done dumping result, returning 46400 1727204545.31144: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-000000000a4d] 46400 1727204545.31155: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4d 46400 1727204545.31243: no more pending results, returning what we have 46400 1727204545.31248: in VariableManager get_vars() 46400 1727204545.31290: Calling all_inventory to load vars for managed-node2 46400 1727204545.31293: Calling groups_inventory to load vars for managed-node2 46400 1727204545.31298: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.31312: Calling all_plugins_play to load vars for managed-node2 
46400 1727204545.31316: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.31319: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.32383: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a4d 46400 1727204545.32387: WORKER PROCESS EXITING 46400 1727204545.33002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.34996: done with get_vars() 46400 1727204545.35017: variable 'ansible_search_path' from source: unknown 46400 1727204545.35019: variable 'ansible_search_path' from source: unknown 46400 1727204545.35060: we have included files to process 46400 1727204545.35061: generating all_blocks data 46400 1727204545.35065: done generating all_blocks data 46400 1727204545.35071: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204545.35073: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204545.35075: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204545.35258: done processing included file 46400 1727204545.35260: iterating over new_blocks loaded from include file 46400 1727204545.35262: in VariableManager get_vars() 46400 1727204545.35280: done with get_vars() 46400 1727204545.35282: filtering new block on tags 46400 1727204545.35310: done filtering new block on tags 46400 1727204545.35312: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 46400 1727204545.35317: extending task lists for all hosts with included blocks 46400 1727204545.36705: done extending task lists 46400 1727204545.36707: done processing included files 46400 1727204545.36708: results queue empty 46400 1727204545.36708: checking for any_errors_fatal 46400 1727204545.36714: done checking for any_errors_fatal 46400 1727204545.36715: checking for max_fail_percentage 46400 1727204545.36716: done checking for max_fail_percentage 46400 1727204545.36717: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.36718: done checking to see if all hosts have failed 46400 1727204545.36719: getting the remaining hosts for this loop 46400 1727204545.36720: done getting the remaining hosts for this loop 46400 1727204545.36723: getting the next task for host managed-node2 46400 1727204545.36727: done getting next task for host managed-node2 46400 1727204545.36729: ^ task is: TASK: Include network role 46400 1727204545.36732: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.36734: getting variables 46400 1727204545.36735: in VariableManager get_vars() 46400 1727204545.36746: Calling all_inventory to load vars for managed-node2 46400 1727204545.36748: Calling groups_inventory to load vars for managed-node2 46400 1727204545.36751: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.36757: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.36759: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.36761: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.38875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.40489: done with get_vars() 46400 1727204545.40515: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.115) 0:00:35.690 ***** 46400 1727204545.40613: entering _queue_task() for managed-node2/include_role 46400 1727204545.41417: worker is 1 (out of 1 available) 46400 1727204545.41430: exiting _queue_task() for managed-node2/include_role 46400 1727204545.41444: done queuing things up, now waiting for results queue to drain 46400 1727204545.41445: waiting for pending results... 46400 1727204545.41739: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204545.41876: in run() - task 0affcd87-79f5-1303-fda8-000000000caa 46400 1727204545.41899: variable 'ansible_search_path' from source: unknown 46400 1727204545.41907: variable 'ansible_search_path' from source: unknown 46400 1727204545.41949: calling self._execute() 46400 1727204545.42056: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.42071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.42087: variable 'omit' from source: magic vars 46400 1727204545.42469: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.42487: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.42498: _execute() done 46400 1727204545.42507: dumping result to json 46400 1727204545.42514: done dumping result, returning 46400 1727204545.42523: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000000caa] 46400 1727204545.42531: sending task result for task 0affcd87-79f5-1303-fda8-000000000caa 46400 1727204545.42675: no more pending results, returning what we have 46400 1727204545.42680: in VariableManager get_vars() 46400 1727204545.42721: Calling all_inventory to load vars for managed-node2 46400 1727204545.42724: Calling groups_inventory to load vars for managed-node2 46400 1727204545.42728: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.42742: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.42745: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.42748: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.43858: done sending task result for task 0affcd87-79f5-1303-fda8-000000000caa 
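The TASK [Include network role] entry above comes from activate_profile.yml:3; the conditional ansible_distribution_major_version != '6' passes and fedora.linux_system_roles.network is loaded and spliced into the task list in the lines that follow. A hedged sketch of what that include is assumed to contain; the role name is taken from the log, and the comment only describes how the surrounding test presumably feeds the profile in:

# sketch of activate_profile.yml:3; the actual file may differ
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  # assumption: the calling test supplies network_connections (here the
  # "statebr" bridge profile created earlier in the log) through vars or
  # set_fact so this include activates it; the exact mechanism is not
  # visible in this part of the log.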
46400 1727204545.43861: WORKER PROCESS EXITING 46400 1727204545.44503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.46144: done with get_vars() 46400 1727204545.46167: variable 'ansible_search_path' from source: unknown 46400 1727204545.46168: variable 'ansible_search_path' from source: unknown 46400 1727204545.46294: variable 'omit' from source: magic vars 46400 1727204545.46336: variable 'omit' from source: magic vars 46400 1727204545.46349: variable 'omit' from source: magic vars 46400 1727204545.46352: we have included files to process 46400 1727204545.46353: generating all_blocks data 46400 1727204545.46354: done generating all_blocks data 46400 1727204545.46356: processing included file: fedora.linux_system_roles.network 46400 1727204545.46391: in VariableManager get_vars() 46400 1727204545.46406: done with get_vars() 46400 1727204545.46450: in VariableManager get_vars() 46400 1727204545.46487: done with get_vars() 46400 1727204545.46538: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204545.46700: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204545.46783: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204545.47246: in VariableManager get_vars() 46400 1727204545.47276: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204545.49980: iterating over new_blocks loaded from include file 46400 1727204545.49982: in VariableManager get_vars() 46400 1727204545.50000: done with get_vars() 46400 1727204545.50002: filtering new block on tags 46400 1727204545.50292: done filtering new block on tags 46400 1727204545.50295: in VariableManager get_vars() 46400 1727204545.50310: done with get_vars() 46400 1727204545.50312: filtering new block on tags 46400 1727204545.50328: done filtering new block on tags 46400 1727204545.50331: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204545.50336: extending task lists for all hosts with included blocks 46400 1727204545.50435: done extending task lists 46400 1727204545.50436: done processing included files 46400 1727204545.50437: results queue empty 46400 1727204545.50438: checking for any_errors_fatal 46400 1727204545.50441: done checking for any_errors_fatal 46400 1727204545.50442: checking for max_fail_percentage 46400 1727204545.50443: done checking for max_fail_percentage 46400 1727204545.50444: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.50444: done checking to see if all hosts have failed 46400 1727204545.50445: getting the remaining hosts for this loop 46400 1727204545.50446: done getting the remaining hosts for this loop 46400 1727204545.50448: getting the next task for host managed-node2 46400 1727204545.50452: done getting next task for host managed-node2 46400 1727204545.50454: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204545.50457: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.50467: getting variables 46400 1727204545.50468: in VariableManager get_vars() 46400 1727204545.50480: Calling all_inventory to load vars for managed-node2 46400 1727204545.50482: Calling groups_inventory to load vars for managed-node2 46400 1727204545.50484: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.50488: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.50490: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.50492: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.51866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.53703: done with get_vars() 46400 1727204545.53732: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.134) 0:00:35.825 ***** 46400 1727204545.54104: entering _queue_task() for managed-node2/include_tasks 46400 1727204545.54453: worker is 1 (out of 1 available) 46400 1727204545.54470: exiting _queue_task() for managed-node2/include_tasks 46400 1727204545.54483: done queuing things up, now waiting for results queue to drain 46400 1727204545.54485: waiting for pending results... 
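The role's first step, at roles/network/tasks/main.yml:4, is the include that the next lines resolve to set_facts.yml. A sketch of that pair, assuming the usual layout; the when-expression on the gathering task is copied from the "Evaluated conditional" entry that appears a little further down, and gather_subset is only an assumption about how the role limits re-gathering:

# roles/network/tasks/main.yml:4 (sketch)
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml

# roles/network/tasks/set_facts.yml:3 (sketch)
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumption; the real subset is not visible in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because the required facts are already cached from earlier in the run, the condition evaluates to False below and the task is skipped.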
46400 1727204545.54771: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204545.54931: in run() - task 0affcd87-79f5-1303-fda8-000000000d16 46400 1727204545.54978: variable 'ansible_search_path' from source: unknown 46400 1727204545.54981: variable 'ansible_search_path' from source: unknown 46400 1727204545.54993: calling self._execute() 46400 1727204545.55072: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.55089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.55093: variable 'omit' from source: magic vars 46400 1727204545.55368: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.55378: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.55385: _execute() done 46400 1727204545.55388: dumping result to json 46400 1727204545.55391: done dumping result, returning 46400 1727204545.55400: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000000d16] 46400 1727204545.55402: sending task result for task 0affcd87-79f5-1303-fda8-000000000d16 46400 1727204545.55496: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d16 46400 1727204545.55499: WORKER PROCESS EXITING 46400 1727204545.55543: no more pending results, returning what we have 46400 1727204545.55548: in VariableManager get_vars() 46400 1727204545.55597: Calling all_inventory to load vars for managed-node2 46400 1727204545.55601: Calling groups_inventory to load vars for managed-node2 46400 1727204545.55603: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.55615: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.55618: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.55621: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.56431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.58050: done with get_vars() 46400 1727204545.58069: variable 'ansible_search_path' from source: unknown 46400 1727204545.58070: variable 'ansible_search_path' from source: unknown 46400 1727204545.58099: we have included files to process 46400 1727204545.58100: generating all_blocks data 46400 1727204545.58102: done generating all_blocks data 46400 1727204545.58104: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204545.58104: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204545.58106: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204545.58511: done processing included file 46400 1727204545.58512: iterating over new_blocks loaded from include file 46400 1727204545.58514: in VariableManager get_vars() 46400 1727204545.58530: done with get_vars() 46400 1727204545.58531: filtering new block on tags 46400 1727204545.58551: done filtering new block on tags 46400 1727204545.58553: in VariableManager get_vars() 46400 1727204545.58570: done with get_vars() 46400 1727204545.58571: filtering new block on tags 46400 1727204545.58600: done filtering new block on tags 46400 1727204545.58602: in 
VariableManager get_vars() 46400 1727204545.58616: done with get_vars() 46400 1727204545.58617: filtering new block on tags 46400 1727204545.58643: done filtering new block on tags 46400 1727204545.58644: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204545.58648: extending task lists for all hosts with included blocks 46400 1727204545.59686: done extending task lists 46400 1727204545.59687: done processing included files 46400 1727204545.59688: results queue empty 46400 1727204545.59688: checking for any_errors_fatal 46400 1727204545.59690: done checking for any_errors_fatal 46400 1727204545.59691: checking for max_fail_percentage 46400 1727204545.59692: done checking for max_fail_percentage 46400 1727204545.59692: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.59693: done checking to see if all hosts have failed 46400 1727204545.59693: getting the remaining hosts for this loop 46400 1727204545.59694: done getting the remaining hosts for this loop 46400 1727204545.59696: getting the next task for host managed-node2 46400 1727204545.59701: done getting next task for host managed-node2 46400 1727204545.59703: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204545.59705: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204545.59712: getting variables 46400 1727204545.59713: in VariableManager get_vars() 46400 1727204545.59725: Calling all_inventory to load vars for managed-node2 46400 1727204545.59730: Calling groups_inventory to load vars for managed-node2 46400 1727204545.59733: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.59743: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.59749: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.59753: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.60981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.62624: done with get_vars() 46400 1727204545.62652: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.086) 0:00:35.911 ***** 46400 1727204545.62741: entering _queue_task() for managed-node2/setup 46400 1727204545.63093: worker is 1 (out of 1 available) 46400 1727204545.63107: exiting _queue_task() for managed-node2/setup 46400 1727204545.63121: done queuing things up, now waiting for results queue to drain 46400 1727204545.63122: waiting for pending results... 46400 1727204545.63411: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204545.63565: in run() - task 0affcd87-79f5-1303-fda8-000000000d6d 46400 1727204545.63580: variable 'ansible_search_path' from source: unknown 46400 1727204545.63584: variable 'ansible_search_path' from source: unknown 46400 1727204545.63619: calling self._execute() 46400 1727204545.63715: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.63722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.63731: variable 'omit' from source: magic vars 46400 1727204545.64112: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.64130: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.64345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204545.66729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204545.66793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204545.66832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204545.66867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204545.66893: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204545.66973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204545.67002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204545.67028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204545.67071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204545.67085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204545.67128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204545.67148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204545.67179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204545.67218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204545.67232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204545.67399: variable '__network_required_facts' from source: role '' defaults 46400 1727204545.67408: variable 'ansible_facts' from source: unknown 46400 1727204545.68200: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204545.68205: when evaluation is False, skipping this task 46400 1727204545.68207: _execute() done 46400 1727204545.68209: dumping result to json 46400 1727204545.68211: done dumping result, returning 46400 1727204545.68218: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000000d6d] 46400 1727204545.68225: sending task result for task 0affcd87-79f5-1303-fda8-000000000d6d 46400 1727204545.68326: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d6d 46400 1727204545.68328: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204545.68389: no more pending results, returning what we have 46400 1727204545.68394: results queue empty 46400 1727204545.68396: checking for any_errors_fatal 46400 1727204545.68398: done checking for any_errors_fatal 46400 1727204545.68398: checking for max_fail_percentage 46400 1727204545.68400: done checking for max_fail_percentage 46400 1727204545.68401: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.68402: done checking to see if all hosts have failed 46400 1727204545.68402: getting the remaining hosts for this loop 46400 1727204545.68405: done getting the remaining hosts for 
this loop 46400 1727204545.68409: getting the next task for host managed-node2 46400 1727204545.68422: done getting next task for host managed-node2 46400 1727204545.68426: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204545.68432: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.68455: getting variables 46400 1727204545.68457: in VariableManager get_vars() 46400 1727204545.68500: Calling all_inventory to load vars for managed-node2 46400 1727204545.68503: Calling groups_inventory to load vars for managed-node2 46400 1727204545.68505: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.68516: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.68519: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.68528: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.70160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.71970: done with get_vars() 46400 1727204545.71996: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.093) 0:00:36.005 ***** 46400 1727204545.72106: entering _queue_task() for managed-node2/stat 46400 1727204545.72437: worker is 1 (out of 1 available) 46400 1727204545.72452: exiting _queue_task() for managed-node2/stat 46400 1727204545.72466: done queuing things up, now waiting for results queue to drain 46400 1727204545.72467: waiting for pending results... 
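The two ostree tasks at set_facts.yml:12 and :17 are both skipped below because __network_is_ostree was already set earlier in the run. A sketch of the pattern, with the when-text quoted from the skip messages; the /run/ostree-booted path and the registered variable name are assumptions about how the check is implemented:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted          # assumed detection file
  register: __ostree_booted_stat      # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined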
46400 1727204545.72757: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204545.72888: in run() - task 0affcd87-79f5-1303-fda8-000000000d6f 46400 1727204545.72902: variable 'ansible_search_path' from source: unknown 46400 1727204545.72906: variable 'ansible_search_path' from source: unknown 46400 1727204545.72946: calling self._execute() 46400 1727204545.73048: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.73054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.73067: variable 'omit' from source: magic vars 46400 1727204545.73446: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.73464: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.73634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204545.73908: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204545.73950: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204545.73983: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204545.74020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204545.74101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204545.74131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204545.74156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204545.74182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204545.74268: variable '__network_is_ostree' from source: set_fact 46400 1727204545.74272: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204545.74275: when evaluation is False, skipping this task 46400 1727204545.74277: _execute() done 46400 1727204545.74282: dumping result to json 46400 1727204545.74285: done dumping result, returning 46400 1727204545.74292: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000000d6f] 46400 1727204545.74297: sending task result for task 0affcd87-79f5-1303-fda8-000000000d6f 46400 1727204545.74393: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d6f 46400 1727204545.74396: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204545.74481: no more pending results, returning what we have 46400 1727204545.74486: results queue empty 46400 1727204545.74487: checking for any_errors_fatal 46400 1727204545.74498: done checking for any_errors_fatal 46400 1727204545.74499: checking for 
max_fail_percentage 46400 1727204545.74501: done checking for max_fail_percentage 46400 1727204545.74502: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.74503: done checking to see if all hosts have failed 46400 1727204545.74504: getting the remaining hosts for this loop 46400 1727204545.74505: done getting the remaining hosts for this loop 46400 1727204545.74510: getting the next task for host managed-node2 46400 1727204545.74520: done getting next task for host managed-node2 46400 1727204545.74523: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204545.74529: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204545.74551: getting variables 46400 1727204545.74553: in VariableManager get_vars() 46400 1727204545.74594: Calling all_inventory to load vars for managed-node2 46400 1727204545.74597: Calling groups_inventory to load vars for managed-node2 46400 1727204545.74599: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.74610: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.74613: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.74616: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.76192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.77863: done with get_vars() 46400 1727204545.77888: done getting variables 46400 1727204545.77950: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.058) 0:00:36.064 ***** 46400 1727204545.77995: entering _queue_task() for managed-node2/set_fact 46400 1727204545.78311: worker is 1 (out of 1 available) 46400 1727204545.78322: exiting _queue_task() for managed-node2/set_fact 46400 1727204545.78334: done queuing things up, now waiting for results queue to drain 46400 1727204545.78336: waiting for pending results... 46400 1727204545.78620: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204545.78764: in run() - task 0affcd87-79f5-1303-fda8-000000000d70 46400 1727204545.78776: variable 'ansible_search_path' from source: unknown 46400 1727204545.78780: variable 'ansible_search_path' from source: unknown 46400 1727204545.78819: calling self._execute() 46400 1727204545.78910: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.78914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.78925: variable 'omit' from source: magic vars 46400 1727204545.79288: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.79299: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.79466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204545.79718: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204545.79766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204545.79794: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204545.79825: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204545.79913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204545.79935: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204545.79963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204545.79989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204545.80075: variable '__network_is_ostree' from source: set_fact 46400 1727204545.80086: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204545.80090: when evaluation is False, skipping this task 46400 1727204545.80092: _execute() done 46400 1727204545.80095: dumping result to json 46400 1727204545.80097: done dumping result, returning 46400 1727204545.80104: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000000d70] 46400 1727204545.80110: sending task result for task 0affcd87-79f5-1303-fda8-000000000d70 46400 1727204545.80207: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d70 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204545.80253: no more pending results, returning what we have 46400 1727204545.80258: results queue empty 46400 1727204545.80259: checking for any_errors_fatal 46400 1727204545.80270: done checking for any_errors_fatal 46400 1727204545.80271: checking for max_fail_percentage 46400 1727204545.80273: done checking for max_fail_percentage 46400 1727204545.80274: checking to see if all hosts have failed and the running result is not ok 46400 1727204545.80275: done checking to see if all hosts have failed 46400 1727204545.80276: getting the remaining hosts for this loop 46400 1727204545.80278: done getting the remaining hosts for this loop 46400 1727204545.80282: getting the next task for host managed-node2 46400 1727204545.80294: done getting next task for host managed-node2 46400 1727204545.80298: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204545.80305: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204545.80316: WORKER PROCESS EXITING 46400 1727204545.80332: getting variables 46400 1727204545.80334: in VariableManager get_vars() 46400 1727204545.80374: Calling all_inventory to load vars for managed-node2 46400 1727204545.80376: Calling groups_inventory to load vars for managed-node2 46400 1727204545.80379: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204545.80390: Calling all_plugins_play to load vars for managed-node2 46400 1727204545.80392: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204545.80395: Calling groups_plugins_play to load vars for managed-node2 46400 1727204545.82201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204545.83328: done with get_vars() 46400 1727204545.83347: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:25 -0400 (0:00:00.054) 0:00:36.118 ***** 46400 1727204545.83427: entering _queue_task() for managed-node2/service_facts 46400 1727204545.83672: worker is 1 (out of 1 available) 46400 1727204545.83685: exiting _queue_task() for managed-node2/service_facts 46400 1727204545.83701: done queuing things up, now waiting for results queue to drain 46400 1727204545.83702: waiting for pending results... 
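The service check at set_facts.yml:21 is the first task in this stretch that actually reaches the remote host: the queue entry above names the service_facts module, and the lines that follow show the generic 'normal' action plugin reusing the multiplexed SSH connection, creating a remote temp directory and shipping the AnsiballZ-packed module over sftp. The task itself is tiny; service_facts takes no options and stores its result under ansible_facts.services:

# sketch of set_facts.yml:21
- name: Check which services are running
  ansible.builtin.service_facts: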
46400 1727204545.83888: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204545.84000: in run() - task 0affcd87-79f5-1303-fda8-000000000d72 46400 1727204545.84010: variable 'ansible_search_path' from source: unknown 46400 1727204545.84014: variable 'ansible_search_path' from source: unknown 46400 1727204545.84048: calling self._execute() 46400 1727204545.84124: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.84129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.84140: variable 'omit' from source: magic vars 46400 1727204545.84430: variable 'ansible_distribution_major_version' from source: facts 46400 1727204545.84491: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204545.84536: variable 'omit' from source: magic vars 46400 1727204545.84545: variable 'omit' from source: magic vars 46400 1727204545.84595: variable 'omit' from source: magic vars 46400 1727204545.85179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204545.85184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204545.85187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204545.85189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204545.85192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204545.85194: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204545.85197: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.85199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.85202: Set connection var ansible_shell_type to sh 46400 1727204545.85204: Set connection var ansible_shell_executable to /bin/sh 46400 1727204545.85206: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204545.85208: Set connection var ansible_connection to ssh 46400 1727204545.85210: Set connection var ansible_pipelining to False 46400 1727204545.85213: Set connection var ansible_timeout to 10 46400 1727204545.85215: variable 'ansible_shell_executable' from source: unknown 46400 1727204545.85217: variable 'ansible_connection' from source: unknown 46400 1727204545.85219: variable 'ansible_module_compression' from source: unknown 46400 1727204545.85221: variable 'ansible_shell_type' from source: unknown 46400 1727204545.85222: variable 'ansible_shell_executable' from source: unknown 46400 1727204545.85224: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204545.85226: variable 'ansible_pipelining' from source: unknown 46400 1727204545.85228: variable 'ansible_timeout' from source: unknown 46400 1727204545.85230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204545.85233: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204545.85235: variable 'omit' from source: magic vars 46400 
1727204545.85238: starting attempt loop 46400 1727204545.85241: running the handler 46400 1727204545.85243: _low_level_execute_command(): starting 46400 1727204545.85245: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204545.85923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.85927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.85939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204545.85951: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204545.85961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.85994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204545.85998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.86059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204545.86062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204545.86069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.86142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.87768: stdout chunk (state=3): >>>/root <<< 46400 1727204545.87877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204545.87926: stderr chunk (state=3): >>><<< 46400 1727204545.87929: stdout chunk (state=3): >>><<< 46400 1727204545.87949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204545.87967: _low_level_execute_command(): starting 46400 1727204545.87972: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601 `" && echo ansible-tmp-1727204545.8794904-49264-925547098601="` echo /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601 `" ) && sleep 0' 46400 1727204545.88403: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.88407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.88439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.88450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204545.88453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204545.88460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.88481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.88534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204545.88542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.88606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.90438: stdout chunk (state=3): >>>ansible-tmp-1727204545.8794904-49264-925547098601=/root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601 <<< 46400 1727204545.90555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204545.90606: stderr chunk (state=3): >>><<< 46400 1727204545.90610: stdout chunk (state=3): >>><<< 46400 1727204545.90624: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204545.8794904-49264-925547098601=/root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204545.90662: variable 'ansible_module_compression' from source: unknown 46400 1727204545.90703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204545.90730: variable 'ansible_facts' from source: unknown 46400 1727204545.90789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/AnsiballZ_service_facts.py 46400 1727204545.90892: Sending initial data 46400 1727204545.90896: Sent initial data (159 bytes) 46400 1727204545.91559: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.91567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.91593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.91606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.91665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204545.91678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.91718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.93397: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 46400 1727204545.93404: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204545.93410: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 46400 1727204545.93415: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 46400 1727204545.93420: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 46400 1727204545.93427: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204545.93465: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 46400 1727204545.93485: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 46400 1727204545.93492: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 46400 1727204545.93524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpdn8fviux /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/AnsiballZ_service_facts.py 
<<< 46400 1727204545.93554: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204545.94380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204545.94490: stderr chunk (state=3): >>><<< 46400 1727204545.94498: stdout chunk (state=3): >>><<< 46400 1727204545.94516: done transferring module to remote 46400 1727204545.94526: _low_level_execute_command(): starting 46400 1727204545.94532: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/ /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/AnsiballZ_service_facts.py && sleep 0' 46400 1727204545.95002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204545.95017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.95033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204545.95047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.95056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.95107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204545.95125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.95161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204545.96858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204545.96910: stderr chunk (state=3): >>><<< 46400 1727204545.96914: stdout chunk (state=3): >>><<< 46400 1727204545.96928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204545.96932: _low_level_execute_command(): starting 46400 1727204545.96934: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/AnsiballZ_service_facts.py && sleep 0' 46400 1727204545.97383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204545.97388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204545.97420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.97432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204545.97488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204545.97500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204545.97549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.26272: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204547.26338: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": 
"systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hiber<<< 46400 1727204547.26362: stdout chunk (state=3): >>>nate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204547.27712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204547.27716: stdout chunk (state=3): >>><<< 46400 1727204547.27718: stderr chunk (state=3): >>><<< 46400 1727204547.27977: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204547.28469: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204547.28487: _low_level_execute_command(): starting 46400 1727204547.28496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204545.8794904-49264-925547098601/ > /dev/null 2>&1 && sleep 0' 46400 1727204547.29219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204547.29235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.29251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.29273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.29327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.29340: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204547.29355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.29376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204547.29388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204547.29410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204547.29423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.29436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.29456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.29473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.29484: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204547.29497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.29586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204547.29609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204547.29633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.29709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.31599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204547.31699: stderr chunk (state=3): >>><<< 46400 1727204547.31711: stdout chunk (state=3): >>><<< 46400 1727204547.31771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204547.31774: handler run complete 46400 1727204547.31975: variable 'ansible_facts' from source: unknown 46400 1727204547.32118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204547.32982: variable 'ansible_facts' from source: unknown 46400 1727204547.33308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204547.33758: attempt loop complete, returning result 46400 1727204547.33773: _execute() done 46400 1727204547.33781: dumping result to json 46400 1727204547.33861: done dumping result, returning 46400 1727204547.33951: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000000d72] 46400 1727204547.33962: sending task result for task 0affcd87-79f5-1303-fda8-000000000d72 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204547.35226: no more pending results, returning what we have 46400 1727204547.35230: results queue empty 46400 1727204547.35231: checking for any_errors_fatal 46400 1727204547.35239: done checking for any_errors_fatal 46400 1727204547.35240: checking for max_fail_percentage 46400 1727204547.35242: done checking for max_fail_percentage 46400 1727204547.35243: checking to see if all hosts have failed and the running result is not ok 46400 1727204547.35244: done checking to see if all hosts have failed 46400 1727204547.35245: getting the remaining hosts for this loop 46400 1727204547.35247: done getting the remaining hosts for this loop 46400 1727204547.35251: getting the next task for host managed-node2 46400 1727204547.35260: done getting next task for host managed-node2 46400 1727204547.35266: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204547.35272: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204547.35284: getting variables 46400 1727204547.35285: in VariableManager get_vars() 46400 1727204547.35325: Calling all_inventory to load vars for managed-node2 46400 1727204547.35328: Calling groups_inventory to load vars for managed-node2 46400 1727204547.35330: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204547.35342: Calling all_plugins_play to load vars for managed-node2 46400 1727204547.35345: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204547.35348: Calling groups_plugins_play to load vars for managed-node2 46400 1727204547.36576: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d72 46400 1727204547.36580: WORKER PROCESS EXITING 46400 1727204547.38189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204547.40101: done with get_vars() 46400 1727204547.40135: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:27 -0400 (0:00:01.569) 0:00:37.688 ***** 46400 1727204547.40367: entering _queue_task() for managed-node2/package_facts 46400 1727204547.40750: worker is 1 (out of 1 available) 46400 1727204547.40767: exiting _queue_task() for managed-node2/package_facts 46400 1727204547.40781: done queuing things up, now waiting for results queue to drain 46400 1727204547.40783: waiting for pending results... 
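
The task queued here runs the package_facts module; as the stdout chunks further down show, its return payload is ansible_facts.packages, a dict keyed by package name whose values are lists of per-architecture entries with name, version, release, epoch, arch and source ("rpm"). A minimal sketch, again assuming that JSON were saved locally to package_facts.json (a hypothetical filename, not part of this run), of reading that structure:

    import json

    # Illustrative only: package_facts returns a list per package name because
    # several architectures or versions can be installed at the same time.
    with open("package_facts.json") as fh:
        payload = json.load(fh)

    packages = payload["ansible_facts"]["packages"]
    for entry in packages.get("openssl", []):
        epoch = entry["epoch"] if entry["epoch"] is not None else 0
        print(f"openssl {epoch}:{entry['version']}-{entry['release']}.{entry['arch']}")
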
46400 1727204547.41098: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204547.41273: in run() - task 0affcd87-79f5-1303-fda8-000000000d73 46400 1727204547.41303: variable 'ansible_search_path' from source: unknown 46400 1727204547.41312: variable 'ansible_search_path' from source: unknown 46400 1727204547.41356: calling self._execute() 46400 1727204547.41469: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204547.41483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204547.41506: variable 'omit' from source: magic vars 46400 1727204547.41915: variable 'ansible_distribution_major_version' from source: facts 46400 1727204547.41939: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204547.41951: variable 'omit' from source: magic vars 46400 1727204547.42036: variable 'omit' from source: magic vars 46400 1727204547.42083: variable 'omit' from source: magic vars 46400 1727204547.42134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204547.42184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204547.42215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204547.42237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204547.42253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204547.42296: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204547.42306: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204547.42319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204547.42430: Set connection var ansible_shell_type to sh 46400 1727204547.42448: Set connection var ansible_shell_executable to /bin/sh 46400 1727204547.42459: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204547.42472: Set connection var ansible_connection to ssh 46400 1727204547.42491: Set connection var ansible_pipelining to False 46400 1727204547.42502: Set connection var ansible_timeout to 10 46400 1727204547.42536: variable 'ansible_shell_executable' from source: unknown 46400 1727204547.42544: variable 'ansible_connection' from source: unknown 46400 1727204547.42551: variable 'ansible_module_compression' from source: unknown 46400 1727204547.42557: variable 'ansible_shell_type' from source: unknown 46400 1727204547.42565: variable 'ansible_shell_executable' from source: unknown 46400 1727204547.42573: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204547.42580: variable 'ansible_pipelining' from source: unknown 46400 1727204547.42594: variable 'ansible_timeout' from source: unknown 46400 1727204547.42602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204547.42823: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204547.42839: variable 'omit' from source: magic vars 46400 
1727204547.42848: starting attempt loop 46400 1727204547.42858: running the handler 46400 1727204547.42879: _low_level_execute_command(): starting 46400 1727204547.42891: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204547.43721: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204547.43742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.43757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.43777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.43829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.43845: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204547.43858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.43878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204547.43888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204547.43903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204547.43916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.43929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.43943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.43958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.43972: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204547.43987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.44071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204547.44096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204547.44112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.44200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.45876: stdout chunk (state=3): >>>/root <<< 46400 1727204547.45975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204547.46079: stderr chunk (state=3): >>><<< 46400 1727204547.46091: stdout chunk (state=3): >>><<< 46400 1727204547.46238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204547.46241: _low_level_execute_command(): starting 46400 1727204547.46244: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501 `" && echo ansible-tmp-1727204547.4612765-49292-45449142627501="` echo /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501 `" ) && sleep 0' 46400 1727204547.46879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204547.46903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.46918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.46936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.46981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.47002: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204547.47019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.47039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204547.47051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204547.47062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204547.47077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.47090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.47111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.47127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.47138: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204547.47152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.47239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204547.47263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204547.47283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.47360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.49265: stdout chunk (state=3): >>>ansible-tmp-1727204547.4612765-49292-45449142627501=/root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501 <<< 46400 1727204547.49379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204547.49479: stderr chunk (state=3): >>><<< 46400 1727204547.49491: stdout chunk (state=3): >>><<< 46400 1727204547.49572: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204547.4612765-49292-45449142627501=/root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204547.49775: variable 'ansible_module_compression' from source: unknown 46400 1727204547.49778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204547.49781: variable 'ansible_facts' from source: unknown 46400 1727204547.49918: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/AnsiballZ_package_facts.py 46400 1727204547.50098: Sending initial data 46400 1727204547.50101: Sent initial data (161 bytes) 46400 1727204547.51144: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204547.51160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.51180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.51207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.51252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.51267: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204547.51283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.51302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204547.51323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204547.51337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204547.51350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.51367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.51385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.51398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.51411: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204547.51434: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.51512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204547.51544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204547.51563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.51637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.53420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204547.53452: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204547.53493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpbal4giir /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/AnsiballZ_package_facts.py <<< 46400 1727204547.53524: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204547.55830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204547.55905: stderr chunk (state=3): >>><<< 46400 1727204547.55909: stdout chunk (state=3): >>><<< 46400 1727204547.55926: done transferring module to remote 46400 1727204547.55936: _low_level_execute_command(): starting 46400 1727204547.55941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/ /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/AnsiballZ_package_facts.py && sleep 0' 46400 1727204547.56395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.56401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.56447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.56451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.56453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.56508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204547.56513: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.56558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204547.58474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204547.58484: stdout chunk (state=3): >>><<< 46400 1727204547.58494: stderr chunk (state=3): >>><<< 46400 1727204547.58519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204547.58527: _low_level_execute_command(): starting 46400 1727204547.58536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/AnsiballZ_package_facts.py && sleep 0' 46400 1727204547.59169: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.59173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.59213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204547.59225: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204547.59238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.59255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204547.59276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204547.59288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204547.59301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204547.59316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204547.59342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204547.59345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204547.59406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 46400 1727204547.59417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204547.59475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204548.06712: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204548.06751: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204548.06757: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", 
"version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 46400 1727204548.06763: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204548.06769: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": 
"x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 46400 1727204548.06774: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204548.06802: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", 
"version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204548.06842: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", 
"release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": 
[{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204548.06851: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204548.06860: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": 
"2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204548.06884: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204548.06902: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204548.08457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204548.08544: stderr chunk (state=3): >>><<< 46400 1727204548.08547: stdout chunk (state=3): >>><<< 46400 1727204548.08784: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204548.10178: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204548.10230: _low_level_execute_command(): starting 46400 1727204548.10233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204547.4612765-49292-45449142627501/ > /dev/null 2>&1 && sleep 0' 46400 1727204548.10853: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204548.10861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204548.10878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204548.10891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204548.10928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204548.10935: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204548.10944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204548.10957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204548.10970: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204548.10977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204548.10984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204548.10993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204548.11005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204548.11012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204548.11018: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204548.11027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204548.11102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204548.11120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204548.11131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204548.11194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204548.13105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204548.13109: stdout chunk (state=3): >>><<< 46400 1727204548.13114: stderr chunk (state=3): >>><<< 46400 1727204548.13133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204548.13139: handler run complete 46400 1727204548.13843: variable 'ansible_facts' from source: unknown 46400 1727204548.14150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.15371: variable 'ansible_facts' from source: unknown 46400 1727204548.15987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.16865: attempt loop complete, returning result 46400 1727204548.16886: _execute() done 46400 1727204548.16894: dumping result to json 46400 1727204548.17140: done dumping result, returning 46400 1727204548.17154: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000000d73] 46400 1727204548.17175: sending task result for task 0affcd87-79f5-1303-fda8-000000000d73 46400 1727204548.19424: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d73 46400 1727204548.19427: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204548.19641: no more pending results, returning what we have 46400 1727204548.19645: results queue empty 46400 1727204548.19646: checking for any_errors_fatal 46400 1727204548.19652: done checking for any_errors_fatal 46400 1727204548.19653: checking for max_fail_percentage 46400 1727204548.19655: done checking for max_fail_percentage 46400 1727204548.19655: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.19656: done checking to see if all hosts have failed 46400 1727204548.19657: getting the remaining hosts for this loop 46400 1727204548.19659: done getting the remaining hosts for this loop 46400 1727204548.19668: getting the next task for host managed-node2 46400 1727204548.19677: done getting next task for host managed-node2 46400 1727204548.19682: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204548.19687: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204548.19700: getting variables 46400 1727204548.19702: in VariableManager get_vars() 46400 1727204548.19735: Calling all_inventory to load vars for managed-node2 46400 1727204548.19738: Calling groups_inventory to load vars for managed-node2 46400 1727204548.19745: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.19756: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.19762: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.19767: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.21342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.23131: done with get_vars() 46400 1727204548.23173: done getting variables 46400 1727204548.23241: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.829) 0:00:38.517 ***** 46400 1727204548.23281: entering _queue_task() for managed-node2/debug 46400 1727204548.23637: worker is 1 (out of 1 available) 46400 1727204548.23650: exiting _queue_task() for managed-node2/debug 46400 1727204548.23672: done queuing things up, now waiting for results queue to drain 46400 1727204548.23675: waiting for pending results... 
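Editor's note: the task banner above points at tasks/main.yml:7. Given the debug action plugin loaded for it, the network_provider variable set earlier via set_fact, and the "Using network provider: nm" message in the result below, the task is most likely a simple debug call along these lines (a sketch, not the role's verbatim source):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"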
46400 1727204548.23973: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204548.24136: in run() - task 0affcd87-79f5-1303-fda8-000000000d17 46400 1727204548.24158: variable 'ansible_search_path' from source: unknown 46400 1727204548.24170: variable 'ansible_search_path' from source: unknown 46400 1727204548.24213: calling self._execute() 46400 1727204548.24316: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.24335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.24351: variable 'omit' from source: magic vars 46400 1727204548.24766: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.24784: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.24795: variable 'omit' from source: magic vars 46400 1727204548.24876: variable 'omit' from source: magic vars 46400 1727204548.24987: variable 'network_provider' from source: set_fact 46400 1727204548.25009: variable 'omit' from source: magic vars 46400 1727204548.25057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204548.25108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204548.25134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204548.25155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204548.25175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204548.25215: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204548.25223: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.25230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.25340: Set connection var ansible_shell_type to sh 46400 1727204548.25355: Set connection var ansible_shell_executable to /bin/sh 46400 1727204548.25369: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204548.25379: Set connection var ansible_connection to ssh 46400 1727204548.25388: Set connection var ansible_pipelining to False 46400 1727204548.25396: Set connection var ansible_timeout to 10 46400 1727204548.25433: variable 'ansible_shell_executable' from source: unknown 46400 1727204548.25440: variable 'ansible_connection' from source: unknown 46400 1727204548.25446: variable 'ansible_module_compression' from source: unknown 46400 1727204548.25452: variable 'ansible_shell_type' from source: unknown 46400 1727204548.25458: variable 'ansible_shell_executable' from source: unknown 46400 1727204548.25469: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.25476: variable 'ansible_pipelining' from source: unknown 46400 1727204548.25482: variable 'ansible_timeout' from source: unknown 46400 1727204548.25489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.25647: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204548.25668: variable 'omit' from source: magic vars 46400 1727204548.25678: starting attempt loop 46400 1727204548.25684: running the handler 46400 1727204548.25734: handler run complete 46400 1727204548.25756: attempt loop complete, returning result 46400 1727204548.25767: _execute() done 46400 1727204548.25775: dumping result to json 46400 1727204548.25781: done dumping result, returning 46400 1727204548.25791: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000000d17] 46400 1727204548.25800: sending task result for task 0affcd87-79f5-1303-fda8-000000000d17 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204548.25971: no more pending results, returning what we have 46400 1727204548.25975: results queue empty 46400 1727204548.25976: checking for any_errors_fatal 46400 1727204548.25987: done checking for any_errors_fatal 46400 1727204548.25988: checking for max_fail_percentage 46400 1727204548.25990: done checking for max_fail_percentage 46400 1727204548.25991: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.25992: done checking to see if all hosts have failed 46400 1727204548.25993: getting the remaining hosts for this loop 46400 1727204548.25995: done getting the remaining hosts for this loop 46400 1727204548.25999: getting the next task for host managed-node2 46400 1727204548.26010: done getting next task for host managed-node2 46400 1727204548.26015: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204548.26020: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204548.26032: getting variables 46400 1727204548.26034: in VariableManager get_vars() 46400 1727204548.26080: Calling all_inventory to load vars for managed-node2 46400 1727204548.26084: Calling groups_inventory to load vars for managed-node2 46400 1727204548.26087: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.26099: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.26102: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.26105: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.27107: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d17 46400 1727204548.27112: WORKER PROCESS EXITING 46400 1727204548.27945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.29757: done with get_vars() 46400 1727204548.29797: done getting variables 46400 1727204548.29868: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.066) 0:00:38.583 ***** 46400 1727204548.29911: entering _queue_task() for managed-node2/fail 46400 1727204548.30283: worker is 1 (out of 1 available) 46400 1727204548.30297: exiting _queue_task() for managed-node2/fail 46400 1727204548.30311: done queuing things up, now waiting for results queue to drain 46400 1727204548.30312: waiting for pending results... 
46400 1727204548.30642: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204548.30811: in run() - task 0affcd87-79f5-1303-fda8-000000000d18 46400 1727204548.30836: variable 'ansible_search_path' from source: unknown 46400 1727204548.30847: variable 'ansible_search_path' from source: unknown 46400 1727204548.30897: calling self._execute() 46400 1727204548.31017: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.31030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.31050: variable 'omit' from source: magic vars 46400 1727204548.31470: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.31492: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.31636: variable 'network_state' from source: role '' defaults 46400 1727204548.31653: Evaluated conditional (network_state != {}): False 46400 1727204548.31666: when evaluation is False, skipping this task 46400 1727204548.31675: _execute() done 46400 1727204548.31683: dumping result to json 46400 1727204548.31690: done dumping result, returning 46400 1727204548.31705: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000000d18] 46400 1727204548.31717: sending task result for task 0affcd87-79f5-1303-fda8-000000000d18 46400 1727204548.31842: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d18 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204548.31899: no more pending results, returning what we have 46400 1727204548.31904: results queue empty 46400 1727204548.31905: checking for any_errors_fatal 46400 1727204548.31914: done checking for any_errors_fatal 46400 1727204548.31915: checking for max_fail_percentage 46400 1727204548.31917: done checking for max_fail_percentage 46400 1727204548.31918: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.31919: done checking to see if all hosts have failed 46400 1727204548.31919: getting the remaining hosts for this loop 46400 1727204548.31921: done getting the remaining hosts for this loop 46400 1727204548.31926: getting the next task for host managed-node2 46400 1727204548.31940: done getting next task for host managed-node2 46400 1727204548.31945: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204548.31950: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204548.31977: getting variables 46400 1727204548.31979: in VariableManager get_vars() 46400 1727204548.32020: Calling all_inventory to load vars for managed-node2 46400 1727204548.32024: Calling groups_inventory to load vars for managed-node2 46400 1727204548.32026: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.32041: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.32045: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.32048: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.33006: WORKER PROCESS EXITING 46400 1727204548.39686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.41567: done with get_vars() 46400 1727204548.41600: done getting variables 46400 1727204548.41668: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.117) 0:00:38.701 ***** 46400 1727204548.41702: entering _queue_task() for managed-node2/fail 46400 1727204548.42211: worker is 1 (out of 1 available) 46400 1727204548.42224: exiting _queue_task() for managed-node2/fail 46400 1727204548.42237: done queuing things up, now waiting for results queue to drain 46400 1727204548.42240: waiting for pending results... 
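Editor's note: this task (tasks/main.yml:18) and the one skipped just above (tasks/main.yml:11) both load the fail action, and in both cases the log reports network_state != {} as the false_condition; any further guards implied by the task names (for example a check on the active provider or the distribution version) are not visible in this trace. A hedged sketch of the shared pattern, with the message wording and extra guards marked as assumptions:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported in this setup  # hypothetical wording, not shown in the log
      when:
        - network_state != {}
        # additional guards implied by the task name are assumed, not logged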
46400 1727204548.42558: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204548.42800: in run() - task 0affcd87-79f5-1303-fda8-000000000d19 46400 1727204548.42824: variable 'ansible_search_path' from source: unknown 46400 1727204548.42834: variable 'ansible_search_path' from source: unknown 46400 1727204548.42887: calling self._execute() 46400 1727204548.42999: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.43014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.43037: variable 'omit' from source: magic vars 46400 1727204548.43626: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.43645: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.43798: variable 'network_state' from source: role '' defaults 46400 1727204548.43815: Evaluated conditional (network_state != {}): False 46400 1727204548.43828: when evaluation is False, skipping this task 46400 1727204548.43837: _execute() done 46400 1727204548.43845: dumping result to json 46400 1727204548.43852: done dumping result, returning 46400 1727204548.43867: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000000d19] 46400 1727204548.43880: sending task result for task 0affcd87-79f5-1303-fda8-000000000d19 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204548.44052: no more pending results, returning what we have 46400 1727204548.44057: results queue empty 46400 1727204548.44058: checking for any_errors_fatal 46400 1727204548.44075: done checking for any_errors_fatal 46400 1727204548.44076: checking for max_fail_percentage 46400 1727204548.44079: done checking for max_fail_percentage 46400 1727204548.44080: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.44080: done checking to see if all hosts have failed 46400 1727204548.44081: getting the remaining hosts for this loop 46400 1727204548.44083: done getting the remaining hosts for this loop 46400 1727204548.44088: getting the next task for host managed-node2 46400 1727204548.44099: done getting next task for host managed-node2 46400 1727204548.44105: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204548.44110: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204548.44134: getting variables 46400 1727204548.44136: in VariableManager get_vars() 46400 1727204548.44183: Calling all_inventory to load vars for managed-node2 46400 1727204548.44186: Calling groups_inventory to load vars for managed-node2 46400 1727204548.44189: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.44202: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.44206: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.44209: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.45336: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d19 46400 1727204548.45340: WORKER PROCESS EXITING 46400 1727204548.46146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.47989: done with get_vars() 46400 1727204548.48019: done getting variables 46400 1727204548.48093: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.064) 0:00:38.765 ***** 46400 1727204548.48130: entering _queue_task() for managed-node2/fail 46400 1727204548.48503: worker is 1 (out of 1 available) 46400 1727204548.48521: exiting _queue_task() for managed-node2/fail 46400 1727204548.48534: done queuing things up, now waiting for results queue to drain 46400 1727204548.48535: waiting for pending results... 
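Editor's note: the teaming abort at tasks/main.yml:25 also uses the fail action, and the trace below shows its skip hinged on ansible_distribution_major_version | int > 9 evaluating to False on this EL9 host. A sketch of the likely shape, with the message text being an assumption:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # hypothetical wording
      when:
        - ansible_distribution_major_version | int > 9
        # the task name suggests a teaming-related guard as well; it is not reached in this log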
46400 1727204548.48855: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204548.49042: in run() - task 0affcd87-79f5-1303-fda8-000000000d1a 46400 1727204548.49070: variable 'ansible_search_path' from source: unknown 46400 1727204548.49086: variable 'ansible_search_path' from source: unknown 46400 1727204548.49131: calling self._execute() 46400 1727204548.49245: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.49259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.49282: variable 'omit' from source: magic vars 46400 1727204548.49717: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.49742: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.49938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204548.52997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204548.53081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204548.53141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204548.53187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204548.53219: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204548.53308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.53348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.53384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.53427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.53451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.53569: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.53589: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204548.53596: when evaluation is False, skipping this task 46400 1727204548.53603: _execute() done 46400 1727204548.53610: dumping result to json 46400 1727204548.53616: done dumping result, returning 46400 1727204548.53626: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000000d1a] 46400 1727204548.53636: sending task result for task 
0affcd87-79f5-1303-fda8-000000000d1a 46400 1727204548.53754: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1a 46400 1727204548.53766: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204548.53814: no more pending results, returning what we have 46400 1727204548.53819: results queue empty 46400 1727204548.53820: checking for any_errors_fatal 46400 1727204548.53825: done checking for any_errors_fatal 46400 1727204548.53826: checking for max_fail_percentage 46400 1727204548.53828: done checking for max_fail_percentage 46400 1727204548.53829: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.53830: done checking to see if all hosts have failed 46400 1727204548.53830: getting the remaining hosts for this loop 46400 1727204548.53832: done getting the remaining hosts for this loop 46400 1727204548.53837: getting the next task for host managed-node2 46400 1727204548.53846: done getting next task for host managed-node2 46400 1727204548.53850: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204548.53855: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204548.53880: getting variables 46400 1727204548.53883: in VariableManager get_vars() 46400 1727204548.53923: Calling all_inventory to load vars for managed-node2 46400 1727204548.53926: Calling groups_inventory to load vars for managed-node2 46400 1727204548.53928: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.53939: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.53942: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.53945: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.55968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.57705: done with get_vars() 46400 1727204548.57733: done getting variables 46400 1727204548.57806: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.097) 0:00:38.862 ***** 46400 1727204548.57845: entering _queue_task() for managed-node2/dnf 46400 1727204548.58219: worker is 1 (out of 1 available) 46400 1727204548.58232: exiting _queue_task() for managed-node2/dnf 46400 1727204548.58245: done queuing things up, now waiting for results queue to drain 46400 1727204548.58246: waiting for pending results... 
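Editor's note: for the DNF check at tasks/main.yml:36, the trace below records two evaluations: (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) is True and (__network_wireless_connections_defined or __network_team_connections_defined) is False, so the task is skipped. A sketch consistent with the dnf action plugin and those guards; the package list, state, and check_mode here are assumptions, since the real module arguments are never sent:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed variable; the actual argument list is not logged
        state: latest
      check_mode: true                   # assumed; an availability check should not change the system
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined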
46400 1727204548.58569: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204548.58745: in run() - task 0affcd87-79f5-1303-fda8-000000000d1b 46400 1727204548.58771: variable 'ansible_search_path' from source: unknown 46400 1727204548.58781: variable 'ansible_search_path' from source: unknown 46400 1727204548.58827: calling self._execute() 46400 1727204548.58937: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.58949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.58973: variable 'omit' from source: magic vars 46400 1727204548.59387: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.59409: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.59639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204548.62295: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204548.62392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204548.62436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204548.62483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204548.62519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204548.62611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.62646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.62687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.62736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.62755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.62901: variable 'ansible_distribution' from source: facts 46400 1727204548.62910: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.62934: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204548.63070: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204548.63223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.63256: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.63292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.63342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.63371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.63419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.63451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.63490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.63539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.63558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.63609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.63636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.63677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.63724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.63743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.63931: variable 'network_connections' from source: include params 46400 1727204548.63949: variable 'interface' from source: play vars 46400 1727204548.64026: variable 'interface' from source: play vars 46400 1727204548.64113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204548.64326: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204548.64377: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204548.64417: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204548.64456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204548.64510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204548.64542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204548.64592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.64624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204548.64687: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204548.64965: variable 'network_connections' from source: include params 46400 1727204548.64977: variable 'interface' from source: play vars 46400 1727204548.65047: variable 'interface' from source: play vars 46400 1727204548.65087: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204548.65096: when evaluation is False, skipping this task 46400 1727204548.65107: _execute() done 46400 1727204548.65115: dumping result to json 46400 1727204548.65122: done dumping result, returning 46400 1727204548.65134: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000d1b] 46400 1727204548.65145: sending task result for task 0affcd87-79f5-1303-fda8-000000000d1b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204548.65317: no more pending results, returning what we have 46400 1727204548.65322: results queue empty 46400 1727204548.65323: checking for any_errors_fatal 46400 1727204548.65333: done checking for any_errors_fatal 46400 1727204548.65334: checking for max_fail_percentage 46400 1727204548.65337: done checking for max_fail_percentage 46400 1727204548.65337: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.65338: done checking to see if all hosts have failed 46400 1727204548.65339: getting the remaining hosts for this loop 46400 1727204548.65341: done getting the remaining hosts for this loop 46400 1727204548.65346: getting the next task for host managed-node2 46400 1727204548.65355: done getting next task for host managed-node2 46400 1727204548.65363: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204548.65370: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204548.65392: getting variables 46400 1727204548.65394: in VariableManager get_vars() 46400 1727204548.65435: Calling all_inventory to load vars for managed-node2 46400 1727204548.65438: Calling groups_inventory to load vars for managed-node2 46400 1727204548.65441: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.65452: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.65455: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.65458: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.66537: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1b 46400 1727204548.66541: WORKER PROCESS EXITING 46400 1727204548.67347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.69288: done with get_vars() 46400 1727204548.69311: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204548.69403: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.115) 0:00:38.978 ***** 46400 1727204548.69438: entering _queue_task() for managed-node2/yum 46400 1727204548.69817: worker is 1 (out of 1 available) 46400 1727204548.69830: exiting _queue_task() for managed-node2/yum 46400 1727204548.69843: done queuing things up, now waiting for results queue to drain 46400 1727204548.69845: waiting for pending results... 
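Editor's note: the YUM variant at tasks/main.yml:48 mirrors the DNF check for older hosts; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows that with this ansible-core the yum action is handled by the dnf plugin anyway. The trace below skips it because ansible_distribution_major_version | int < 8 is False. A sketch of the recorded guard, with module arguments again assumed as in the DNF sketch:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed variable, as above
        state: latest
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8
        # the wireless/team guard from the task name presumably applies here too; it is not reached in this log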
46400 1727204548.70158: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204548.70306: in run() - task 0affcd87-79f5-1303-fda8-000000000d1c 46400 1727204548.70325: variable 'ansible_search_path' from source: unknown 46400 1727204548.70332: variable 'ansible_search_path' from source: unknown 46400 1727204548.70380: calling self._execute() 46400 1727204548.70488: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.70501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.70516: variable 'omit' from source: magic vars 46400 1727204548.70929: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.70950: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.71146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204548.73712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204548.73808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204548.73855: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204548.73906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204548.73943: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204548.74032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.74076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.74112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.74165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.74188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.74299: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.74324: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204548.74332: when evaluation is False, skipping this task 46400 1727204548.74338: _execute() done 46400 1727204548.74345: dumping result to json 46400 1727204548.74352: done dumping result, returning 46400 1727204548.74371: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000d1c] 46400 
1727204548.74387: sending task result for task 0affcd87-79f5-1303-fda8-000000000d1c 46400 1727204548.74522: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1c skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204548.74581: no more pending results, returning what we have 46400 1727204548.74586: results queue empty 46400 1727204548.74587: checking for any_errors_fatal 46400 1727204548.74596: done checking for any_errors_fatal 46400 1727204548.74597: checking for max_fail_percentage 46400 1727204548.74599: done checking for max_fail_percentage 46400 1727204548.74600: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.74600: done checking to see if all hosts have failed 46400 1727204548.74601: getting the remaining hosts for this loop 46400 1727204548.74603: done getting the remaining hosts for this loop 46400 1727204548.74607: getting the next task for host managed-node2 46400 1727204548.74619: done getting next task for host managed-node2 46400 1727204548.74624: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204548.74629: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204548.74651: getting variables 46400 1727204548.74653: in VariableManager get_vars() 46400 1727204548.74699: Calling all_inventory to load vars for managed-node2 46400 1727204548.74702: Calling groups_inventory to load vars for managed-node2 46400 1727204548.74705: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.74716: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.74719: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.74722: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.75706: WORKER PROCESS EXITING 46400 1727204548.76592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.78415: done with get_vars() 46400 1727204548.78451: done getting variables 46400 1727204548.78518: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.091) 0:00:39.070 ***** 46400 1727204548.78562: entering _queue_task() for managed-node2/fail 46400 1727204548.78920: worker is 1 (out of 1 available) 46400 1727204548.78933: exiting _queue_task() for managed-node2/fail 46400 1727204548.78946: done queuing things up, now waiting for results queue to drain 46400 1727204548.78948: waiting for pending results... 
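The trace that follows covers the "Ask user's consent to restart NetworkManager due to wireless or team interfaces" task at tasks/main.yml:60. The log confirms only that the 'fail' action plugin is used and that the task is gated on "__network_wireless_connections_defined or __network_team_connections_defined", which evaluates to False further down, so the task is skipped. A minimal sketch of what such a task could look like; the message wording is an assumption, since the role's YAML itself is not reproduced in this log:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    # msg wording is assumed; the trace confirms only the fail action and the when condition
    msg: >-
      NetworkManager must be restarted to apply wireless or team interface changes;
      confirm the restart before re-running the role.
  when: __network_wireless_connections_defined or __network_team_connections_defined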
46400 1727204548.79267: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204548.79428: in run() - task 0affcd87-79f5-1303-fda8-000000000d1d 46400 1727204548.79450: variable 'ansible_search_path' from source: unknown 46400 1727204548.79457: variable 'ansible_search_path' from source: unknown 46400 1727204548.79506: calling self._execute() 46400 1727204548.79614: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.79625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.79638: variable 'omit' from source: magic vars 46400 1727204548.80048: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.80069: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.80208: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204548.80428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204548.82999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204548.83100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204548.83147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204548.83191: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204548.83224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204548.83320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.83359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.83397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.83447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.83478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.83532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.83563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.83598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.83641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.83657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.83702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.83724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.83752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.83803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.83821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.84022: variable 'network_connections' from source: include params 46400 1727204548.84040: variable 'interface' from source: play vars 46400 1727204548.84122: variable 'interface' from source: play vars 46400 1727204548.84207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204548.84401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204548.84848: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204548.84892: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204548.84924: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204548.84980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204548.85011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204548.85038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.85077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204548.85135: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204548.85420: variable 'network_connections' 
from source: include params 46400 1727204548.85433: variable 'interface' from source: play vars 46400 1727204548.85509: variable 'interface' from source: play vars 46400 1727204548.85543: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204548.85551: when evaluation is False, skipping this task 46400 1727204548.85559: _execute() done 46400 1727204548.85570: dumping result to json 46400 1727204548.85577: done dumping result, returning 46400 1727204548.85592: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000d1d] 46400 1727204548.85601: sending task result for task 0affcd87-79f5-1303-fda8-000000000d1d skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204548.85776: no more pending results, returning what we have 46400 1727204548.85780: results queue empty 46400 1727204548.85782: checking for any_errors_fatal 46400 1727204548.85789: done checking for any_errors_fatal 46400 1727204548.85790: checking for max_fail_percentage 46400 1727204548.85792: done checking for max_fail_percentage 46400 1727204548.85792: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.85793: done checking to see if all hosts have failed 46400 1727204548.85794: getting the remaining hosts for this loop 46400 1727204548.85796: done getting the remaining hosts for this loop 46400 1727204548.85801: getting the next task for host managed-node2 46400 1727204548.85810: done getting next task for host managed-node2 46400 1727204548.85815: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204548.85820: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204548.85840: getting variables 46400 1727204548.85842: in VariableManager get_vars() 46400 1727204548.85885: Calling all_inventory to load vars for managed-node2 46400 1727204548.85888: Calling groups_inventory to load vars for managed-node2 46400 1727204548.85891: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.85902: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.85905: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.85908: Calling groups_plugins_play to load vars for managed-node2 46400 1727204548.86909: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1d 46400 1727204548.86913: WORKER PROCESS EXITING 46400 1727204548.87998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204548.89766: done with get_vars() 46400 1727204548.89801: done getting variables 46400 1727204548.89881: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:28 -0400 (0:00:00.113) 0:00:39.183 ***** 46400 1727204548.89922: entering _queue_task() for managed-node2/package 46400 1727204548.90306: worker is 1 (out of 1 available) 46400 1727204548.90321: exiting _queue_task() for managed-node2/package 46400 1727204548.90336: done queuing things up, now waiting for results queue to drain 46400 1727204548.90338: waiting for pending results... 
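The next trace is the "Install packages" task at tasks/main.yml:73. The 'package' action plugin is loaded, network_packages is resolved from the role defaults, and the task is skipped because "not network_packages is subset(ansible_facts.packages.keys())" evaluates to False, i.e. every listed package is already installed. A rough sketch of the task inferred from those facts (state: present is an assumption):

- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present   # assumed; the trace confirms the package action, the variable, and the condition
  when: not network_packages is subset(ansible_facts.packages.keys())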
46400 1727204548.90671: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204548.90845: in run() - task 0affcd87-79f5-1303-fda8-000000000d1e 46400 1727204548.90869: variable 'ansible_search_path' from source: unknown 46400 1727204548.90879: variable 'ansible_search_path' from source: unknown 46400 1727204548.90924: calling self._execute() 46400 1727204548.91039: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204548.91058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204548.91078: variable 'omit' from source: magic vars 46400 1727204548.91495: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.91516: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204548.91742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204548.92043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204548.92104: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204548.92142: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204548.92232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204548.92362: variable 'network_packages' from source: role '' defaults 46400 1727204548.92486: variable '__network_provider_setup' from source: role '' defaults 46400 1727204548.92504: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204548.92580: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204548.92594: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204548.92670: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204548.92882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204548.94466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204548.94512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204548.94541: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204548.94571: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204548.94594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204548.94659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.94684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.94704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.94731: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.94741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.94779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.94794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.94816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.94855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.94868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.95084: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204548.95247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.95250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.95252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.95283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.95297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.95386: variable 'ansible_python' from source: facts 46400 1727204548.95403: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204548.95488: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204548.95567: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204548.95682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.95705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204548.95728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.95773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.95782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.95827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204548.95849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204548.95873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.95912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204548.95926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204548.96069: variable 'network_connections' from source: include params 46400 1727204548.96072: variable 'interface' from source: play vars 46400 1727204548.96174: variable 'interface' from source: play vars 46400 1727204548.96241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204548.96273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204548.96298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204548.96337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204548.96376: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204548.96646: variable 'network_connections' from source: include params 46400 1727204548.96649: variable 'interface' from source: play vars 46400 1727204548.96729: variable 'interface' from source: play vars 46400 1727204548.96752: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204548.96812: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204548.97010: variable 'network_connections' from source: include params 46400 
1727204548.97014: variable 'interface' from source: play vars 46400 1727204548.97059: variable 'interface' from source: play vars 46400 1727204548.97078: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204548.97134: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204548.97330: variable 'network_connections' from source: include params 46400 1727204548.97334: variable 'interface' from source: play vars 46400 1727204548.97381: variable 'interface' from source: play vars 46400 1727204548.97420: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204548.97467: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204548.97471: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204548.97512: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204548.97653: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204548.97954: variable 'network_connections' from source: include params 46400 1727204548.97958: variable 'interface' from source: play vars 46400 1727204548.98005: variable 'interface' from source: play vars 46400 1727204548.98011: variable 'ansible_distribution' from source: facts 46400 1727204548.98014: variable '__network_rh_distros' from source: role '' defaults 46400 1727204548.98020: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.98030: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204548.98145: variable 'ansible_distribution' from source: facts 46400 1727204548.98149: variable '__network_rh_distros' from source: role '' defaults 46400 1727204548.98151: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.98167: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204548.98274: variable 'ansible_distribution' from source: facts 46400 1727204548.98278: variable '__network_rh_distros' from source: role '' defaults 46400 1727204548.98281: variable 'ansible_distribution_major_version' from source: facts 46400 1727204548.98311: variable 'network_provider' from source: set_fact 46400 1727204548.98323: variable 'ansible_facts' from source: unknown 46400 1727204548.98991: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204548.98994: when evaluation is False, skipping this task 46400 1727204548.98996: _execute() done 46400 1727204548.98998: dumping result to json 46400 1727204548.99000: done dumping result, returning 46400 1727204548.99003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000000d1e] 46400 1727204548.99005: sending task result for task 0affcd87-79f5-1303-fda8-000000000d1e 46400 1727204548.99133: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1e 46400 1727204548.99136: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204548.99188: no more pending results, returning what we have 46400 1727204548.99192: results queue empty 46400 1727204548.99193: checking for any_errors_fatal 46400 1727204548.99201: done checking for 
any_errors_fatal 46400 1727204548.99201: checking for max_fail_percentage 46400 1727204548.99203: done checking for max_fail_percentage 46400 1727204548.99204: checking to see if all hosts have failed and the running result is not ok 46400 1727204548.99205: done checking to see if all hosts have failed 46400 1727204548.99206: getting the remaining hosts for this loop 46400 1727204548.99207: done getting the remaining hosts for this loop 46400 1727204548.99212: getting the next task for host managed-node2 46400 1727204548.99219: done getting next task for host managed-node2 46400 1727204548.99223: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204548.99228: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204548.99247: getting variables 46400 1727204548.99249: in VariableManager get_vars() 46400 1727204548.99298: Calling all_inventory to load vars for managed-node2 46400 1727204548.99301: Calling groups_inventory to load vars for managed-node2 46400 1727204548.99303: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204548.99312: Calling all_plugins_play to load vars for managed-node2 46400 1727204548.99314: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204548.99316: Calling groups_plugins_play to load vars for managed-node2 46400 1727204549.00908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204549.02787: done with get_vars() 46400 1727204549.02817: done getting variables 46400 1727204549.02896: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:29 -0400 (0:00:00.130) 0:00:39.313 ***** 46400 1727204549.02934: entering _queue_task() for managed-node2/package 46400 1727204549.03319: worker is 1 (out of 1 available) 46400 1727204549.03332: exiting _queue_task() for managed-node2/package 46400 1727204549.03344: done queuing things up, now waiting for results queue to drain 46400 1727204549.03346: waiting for pending results... 
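The following trace is the "Install NetworkManager and nmstate when using network_state variable" task at tasks/main.yml:85, again using the 'package' action and skipped because network_state is empty. A plausible shape for it, with the package names taken from the task title rather than from the (unshown) role source:

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:   # package names inferred from the task title, not from the role YAML
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}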
46400 1727204549.03655: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204549.03824: in run() - task 0affcd87-79f5-1303-fda8-000000000d1f 46400 1727204549.03844: variable 'ansible_search_path' from source: unknown 46400 1727204549.03853: variable 'ansible_search_path' from source: unknown 46400 1727204549.03910: calling self._execute() 46400 1727204549.04023: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.04036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.04051: variable 'omit' from source: magic vars 46400 1727204549.04498: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.04516: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204549.04675: variable 'network_state' from source: role '' defaults 46400 1727204549.04689: Evaluated conditional (network_state != {}): False 46400 1727204549.04696: when evaluation is False, skipping this task 46400 1727204549.04703: _execute() done 46400 1727204549.04709: dumping result to json 46400 1727204549.04715: done dumping result, returning 46400 1727204549.04725: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000d1f] 46400 1727204549.04736: sending task result for task 0affcd87-79f5-1303-fda8-000000000d1f 46400 1727204549.04877: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d1f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204549.04931: no more pending results, returning what we have 46400 1727204549.04936: results queue empty 46400 1727204549.04937: checking for any_errors_fatal 46400 1727204549.04943: done checking for any_errors_fatal 46400 1727204549.04944: checking for max_fail_percentage 46400 1727204549.04945: done checking for max_fail_percentage 46400 1727204549.04946: checking to see if all hosts have failed and the running result is not ok 46400 1727204549.04947: done checking to see if all hosts have failed 46400 1727204549.04948: getting the remaining hosts for this loop 46400 1727204549.04950: done getting the remaining hosts for this loop 46400 1727204549.04954: getting the next task for host managed-node2 46400 1727204549.04970: done getting next task for host managed-node2 46400 1727204549.04976: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204549.04983: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204549.05007: getting variables 46400 1727204549.05009: in VariableManager get_vars() 46400 1727204549.05048: Calling all_inventory to load vars for managed-node2 46400 1727204549.05051: Calling groups_inventory to load vars for managed-node2 46400 1727204549.05053: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204549.05071: Calling all_plugins_play to load vars for managed-node2 46400 1727204549.05076: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204549.05079: Calling groups_plugins_play to load vars for managed-node2 46400 1727204549.06117: WORKER PROCESS EXITING 46400 1727204549.07406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204549.09439: done with get_vars() 46400 1727204549.09483: done getting variables 46400 1727204549.09551: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:29 -0400 (0:00:00.066) 0:00:39.380 ***** 46400 1727204549.09604: entering _queue_task() for managed-node2/package 46400 1727204549.09994: worker is 1 (out of 1 available) 46400 1727204549.10008: exiting _queue_task() for managed-node2/package 46400 1727204549.10025: done queuing things up, now waiting for results queue to drain 46400 1727204549.10027: waiting for pending results... 
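Next comes "Install python3-libnmstate when using network_state variable" at tasks/main.yml:96, another 'package' task gated on the same empty network_state. A comparable sketch, with the package name again taken from the task title:

- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate   # inferred from the task title
    state: present
  when: network_state != {}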
46400 1727204549.10354: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204549.10537: in run() - task 0affcd87-79f5-1303-fda8-000000000d20 46400 1727204549.10557: variable 'ansible_search_path' from source: unknown 46400 1727204549.10570: variable 'ansible_search_path' from source: unknown 46400 1727204549.10658: calling self._execute() 46400 1727204549.10787: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.10818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.10833: variable 'omit' from source: magic vars 46400 1727204549.12401: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.12420: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204549.12667: variable 'network_state' from source: role '' defaults 46400 1727204549.12814: Evaluated conditional (network_state != {}): False 46400 1727204549.12821: when evaluation is False, skipping this task 46400 1727204549.12827: _execute() done 46400 1727204549.12834: dumping result to json 46400 1727204549.12841: done dumping result, returning 46400 1727204549.12852: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000000d20] 46400 1727204549.12865: sending task result for task 0affcd87-79f5-1303-fda8-000000000d20 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204549.13027: no more pending results, returning what we have 46400 1727204549.13032: results queue empty 46400 1727204549.13033: checking for any_errors_fatal 46400 1727204549.13041: done checking for any_errors_fatal 46400 1727204549.13042: checking for max_fail_percentage 46400 1727204549.13044: done checking for max_fail_percentage 46400 1727204549.13045: checking to see if all hosts have failed and the running result is not ok 46400 1727204549.13046: done checking to see if all hosts have failed 46400 1727204549.13047: getting the remaining hosts for this loop 46400 1727204549.13048: done getting the remaining hosts for this loop 46400 1727204549.13053: getting the next task for host managed-node2 46400 1727204549.13067: done getting next task for host managed-node2 46400 1727204549.13071: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204549.13078: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204549.13102: getting variables 46400 1727204549.13104: in VariableManager get_vars() 46400 1727204549.13143: Calling all_inventory to load vars for managed-node2 46400 1727204549.13146: Calling groups_inventory to load vars for managed-node2 46400 1727204549.13149: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204549.13166: Calling all_plugins_play to load vars for managed-node2 46400 1727204549.13170: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204549.13174: Calling groups_plugins_play to load vars for managed-node2 46400 1727204549.14234: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d20 46400 1727204549.14238: WORKER PROCESS EXITING 46400 1727204549.15577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204549.18651: done with get_vars() 46400 1727204549.18691: done getting variables 46400 1727204549.18757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:29 -0400 (0:00:00.092) 0:00:39.472 ***** 46400 1727204549.18819: entering _queue_task() for managed-node2/service 46400 1727204549.19182: worker is 1 (out of 1 available) 46400 1727204549.19195: exiting _queue_task() for managed-node2/service 46400 1727204549.19209: done queuing things up, now waiting for results queue to drain 46400 1727204549.19210: waiting for pending results... 
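The trace below is "Restart NetworkManager due to wireless or team interfaces" at tasks/main.yml:109. The 'service' action plugin is loaded and the task is skipped on the same wireless/team condition as the consent task above. A hypothetical rendering; the service name and restart state are assumptions, as only the service action and the condition appear in the trace:

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed; the role may use a variable such as network_service_name here
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined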
46400 1727204549.19515: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204549.19641: in run() - task 0affcd87-79f5-1303-fda8-000000000d21 46400 1727204549.19658: variable 'ansible_search_path' from source: unknown 46400 1727204549.19668: variable 'ansible_search_path' from source: unknown 46400 1727204549.19698: calling self._execute() 46400 1727204549.19789: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.19793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.19803: variable 'omit' from source: magic vars 46400 1727204549.20240: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.20275: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204549.20422: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204549.20715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204549.23567: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204549.23641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204549.23681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204549.23716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204549.23743: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204549.23818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.23849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.23878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.23918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.23931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.24031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.24034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.24037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204549.24371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.24374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.24377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.24379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.24382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.24384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.24387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.24405: variable 'network_connections' from source: include params 46400 1727204549.24418: variable 'interface' from source: play vars 46400 1727204549.24492: variable 'interface' from source: play vars 46400 1727204549.24567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204549.24739: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204549.24775: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204549.24817: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204549.24850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204549.24896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204549.24915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204549.24938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.24966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204549.25008: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204549.25277: variable 'network_connections' from source: include params 46400 1727204549.25281: variable 'interface' 
from source: play vars 46400 1727204549.25369: variable 'interface' from source: play vars 46400 1727204549.25397: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204549.25400: when evaluation is False, skipping this task 46400 1727204549.25403: _execute() done 46400 1727204549.25405: dumping result to json 46400 1727204549.25407: done dumping result, returning 46400 1727204549.25416: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000000d21] 46400 1727204549.25422: sending task result for task 0affcd87-79f5-1303-fda8-000000000d21 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204549.25567: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d21 46400 1727204549.25589: no more pending results, returning what we have 46400 1727204549.25596: results queue empty 46400 1727204549.25597: checking for any_errors_fatal 46400 1727204549.25603: done checking for any_errors_fatal 46400 1727204549.25604: checking for max_fail_percentage 46400 1727204549.25606: done checking for max_fail_percentage 46400 1727204549.25607: checking to see if all hosts have failed and the running result is not ok 46400 1727204549.25607: done checking to see if all hosts have failed 46400 1727204549.25608: getting the remaining hosts for this loop 46400 1727204549.25610: done getting the remaining hosts for this loop 46400 1727204549.25614: getting the next task for host managed-node2 46400 1727204549.25623: done getting next task for host managed-node2 46400 1727204549.25627: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204549.25632: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204549.25657: getting variables 46400 1727204549.25659: in VariableManager get_vars() 46400 1727204549.25700: Calling all_inventory to load vars for managed-node2 46400 1727204549.25703: Calling groups_inventory to load vars for managed-node2 46400 1727204549.25705: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204549.25717: Calling all_plugins_play to load vars for managed-node2 46400 1727204549.25720: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204549.25724: Calling groups_plugins_play to load vars for managed-node2 46400 1727204549.26672: WORKER PROCESS EXITING 46400 1727204549.27688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204549.29529: done with get_vars() 46400 1727204549.29570: done getting variables 46400 1727204549.29636: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:29 -0400 (0:00:00.108) 0:00:39.581 ***** 46400 1727204549.29682: entering _queue_task() for managed-node2/service 46400 1727204549.30068: worker is 1 (out of 1 available) 46400 1727204549.30092: exiting _queue_task() for managed-node2/service 46400 1727204549.30108: done queuing things up, now waiting for results queue to drain 46400 1727204549.30110: waiting for pending results... 
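Finally, "Enable and start NetworkManager" at tasks/main.yml:122 uses the 'service' action and, unlike the tasks above, its condition "network_provider == \"nm\" or network_state != {}" evaluates to True in the trace that follows, so the task proceeds; the log also shows network_service_name being resolved from the role defaults. A sketch consistent with that trace (the started/enabled states are assumptions implied by the task title):

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started    # implied by the task title; not shown verbatim in the log
    enabled: true
  when: network_provider == "nm" or network_state != {}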
46400 1727204549.30436: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204549.30606: in run() - task 0affcd87-79f5-1303-fda8-000000000d22 46400 1727204549.30632: variable 'ansible_search_path' from source: unknown 46400 1727204549.30642: variable 'ansible_search_path' from source: unknown 46400 1727204549.30692: calling self._execute() 46400 1727204549.30808: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.30820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.30835: variable 'omit' from source: magic vars 46400 1727204549.31273: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.31296: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204549.31468: variable 'network_provider' from source: set_fact 46400 1727204549.31478: variable 'network_state' from source: role '' defaults 46400 1727204549.31492: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204549.31504: variable 'omit' from source: magic vars 46400 1727204549.31577: variable 'omit' from source: magic vars 46400 1727204549.31611: variable 'network_service_name' from source: role '' defaults 46400 1727204549.31688: variable 'network_service_name' from source: role '' defaults 46400 1727204549.31809: variable '__network_provider_setup' from source: role '' defaults 46400 1727204549.31824: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204549.31899: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204549.31911: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204549.31987: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204549.32222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204549.35018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204549.35110: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204549.35155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204549.35198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204549.35240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204549.35332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.35374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.35407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.35467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204549.35489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.35542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.35579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.35607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.35656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.35684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.35941: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204549.36082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.36112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.36136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.36176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.36196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.36292: variable 'ansible_python' from source: facts 46400 1727204549.36319: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204549.36412: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204549.36505: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204549.36647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.36678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.36705: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.36758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.36782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.36830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204549.36879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204549.36909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.36958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204549.36985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204549.37141: variable 'network_connections' from source: include params 46400 1727204549.37153: variable 'interface' from source: play vars 46400 1727204549.37241: variable 'interface' from source: play vars 46400 1727204549.37354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204549.37587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204549.37646: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204549.37696: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204549.37745: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204549.37814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204549.37854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204549.37895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204549.37930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204549.37994: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204549.38304: variable 'network_connections' from source: include params 46400 1727204549.38315: variable 'interface' from source: play vars 46400 1727204549.38401: variable 'interface' from source: play vars 46400 1727204549.38436: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204549.38531: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204549.38854: variable 'network_connections' from source: include params 46400 1727204549.38869: variable 'interface' from source: play vars 46400 1727204549.38946: variable 'interface' from source: play vars 46400 1727204549.38979: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204549.39067: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204549.39340: variable 'network_connections' from source: include params 46400 1727204549.39350: variable 'interface' from source: play vars 46400 1727204549.39476: variable 'interface' from source: play vars 46400 1727204549.39538: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204549.39616: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204549.39629: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204549.39706: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204549.39944: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204549.41087: variable 'network_connections' from source: include params 46400 1727204549.41098: variable 'interface' from source: play vars 46400 1727204549.41168: variable 'interface' from source: play vars 46400 1727204549.41188: variable 'ansible_distribution' from source: facts 46400 1727204549.41199: variable '__network_rh_distros' from source: role '' defaults 46400 1727204549.41227: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.41248: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204549.41453: variable 'ansible_distribution' from source: facts 46400 1727204549.41469: variable '__network_rh_distros' from source: role '' defaults 46400 1727204549.41480: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.41496: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204549.41686: variable 'ansible_distribution' from source: facts 46400 1727204549.41697: variable '__network_rh_distros' from source: role '' defaults 46400 1727204549.41707: variable 'ansible_distribution_major_version' from source: facts 46400 1727204549.41757: variable 'network_provider' from source: set_fact 46400 1727204549.41789: variable 'omit' from source: magic vars 46400 1727204549.41824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204549.41869: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204549.41894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204549.41918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204549.41934: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204549.41979: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204549.41987: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.41994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.42102: Set connection var ansible_shell_type to sh 46400 1727204549.42119: Set connection var ansible_shell_executable to /bin/sh 46400 1727204549.42129: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204549.42138: Set connection var ansible_connection to ssh 46400 1727204549.42147: Set connection var ansible_pipelining to False 46400 1727204549.42166: Set connection var ansible_timeout to 10 46400 1727204549.42202: variable 'ansible_shell_executable' from source: unknown 46400 1727204549.42210: variable 'ansible_connection' from source: unknown 46400 1727204549.42217: variable 'ansible_module_compression' from source: unknown 46400 1727204549.42223: variable 'ansible_shell_type' from source: unknown 46400 1727204549.42229: variable 'ansible_shell_executable' from source: unknown 46400 1727204549.42277: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204549.42290: variable 'ansible_pipelining' from source: unknown 46400 1727204549.42298: variable 'ansible_timeout' from source: unknown 46400 1727204549.42306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204549.42494: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204549.42520: variable 'omit' from source: magic vars 46400 1727204549.42529: starting attempt loop 46400 1727204549.42534: running the handler 46400 1727204549.42681: variable 'ansible_facts' from source: unknown 46400 1727204549.43533: _low_level_execute_command(): starting 46400 1727204549.43547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204549.44355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204549.44384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.44401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204549.44422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.44489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204549.44501: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204549.44514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.44530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204549.44539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204549.44549: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204549.44562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.44602: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204549.44630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.44642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204549.44677: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204549.44707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.45545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204549.45648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204549.45670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.45853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.47407: stdout chunk (state=3): >>>/root <<< 46400 1727204549.47582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204549.47622: stderr chunk (state=3): >>><<< 46400 1727204549.47625: stdout chunk (state=3): >>><<< 46400 1727204549.47741: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204549.47746: _low_level_execute_command(): starting 46400 1727204549.47749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270 `" && echo ansible-tmp-1727204549.4764717-49339-257910234636270="` echo /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270 `" ) && sleep 0' 46400 1727204549.49228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.49232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.49373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204549.49378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204549.49381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204549.49383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.49438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204549.49580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.49678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.51530: stdout chunk (state=3): >>>ansible-tmp-1727204549.4764717-49339-257910234636270=/root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270 <<< 46400 1727204549.51715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204549.51719: stdout chunk (state=3): >>><<< 46400 1727204549.51726: stderr chunk (state=3): >>><<< 46400 1727204549.51747: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204549.4764717-49339-257910234636270=/root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204549.51784: variable 'ansible_module_compression' from source: unknown 46400 1727204549.51839: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204549.51900: variable 'ansible_facts' from source: unknown 46400 1727204549.52119: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/AnsiballZ_systemd.py 46400 1727204549.53021: Sending initial data 46400 1727204549.53024: Sent initial data (156 bytes) 46400 1727204549.55693: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.55784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204549.55839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.55928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.57644: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204549.57681: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204549.57724: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp3hrvkvji /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/AnsiballZ_systemd.py <<< 46400 1727204549.57759: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204549.60683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204549.60887: stderr chunk (state=3): >>><<< 46400 1727204549.60891: stdout chunk (state=3): >>><<< 46400 1727204549.60892: done transferring module to remote 46400 1727204549.60894: _low_level_execute_command(): starting 46400 1727204549.60896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/ /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/AnsiballZ_systemd.py && sleep 0' 46400 1727204549.61749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.61753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.61799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204549.61804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204549.61806: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204549.61808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.61874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204549.61878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204549.61891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.61933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.63649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204549.63731: stderr chunk (state=3): >>><<< 46400 1727204549.63736: stdout chunk (state=3): >>><<< 46400 1727204549.63834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204549.63837: _low_level_execute_command(): starting 46400 1727204549.63840: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/AnsiballZ_systemd.py && sleep 0' 46400 1727204549.65301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.65305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.65337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204549.65340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.65342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204549.65345: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.65604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204549.65679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204549.65682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.65753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.90940: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204549.90978: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "7012352", "MemoryAvailable": "infinity", "CPUUsageNSec": "2076076000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204549.90991: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204549.92447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204549.92543: stderr chunk (state=3): >>><<< 46400 1727204549.92549: stdout chunk (state=3): >>><<< 46400 1727204549.92834: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "7012352", "MemoryAvailable": "infinity", "CPUUsageNSec": "2076076000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204549.92845: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204549.92849: _low_level_execute_command(): starting 46400 1727204549.92852: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204549.4764717-49339-257910234636270/ > /dev/null 2>&1 && sleep 0' 46400 1727204549.93426: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204549.93442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.93458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204549.93481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.93525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204549.93538: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204549.93553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.93577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204549.93590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204549.93602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204549.93614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204549.93629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204549.93646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204549.93659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204549.93678: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204549.93692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204549.93769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204549.93787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204549.93801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204549.94181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204549.96385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204549.96393: stdout chunk (state=3): >>><<< 46400 1727204549.96396: stderr chunk (state=3): >>><<< 46400 1727204549.96399: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204549.96401: handler run complete 46400 1727204549.96404: attempt loop complete, returning result 46400 1727204549.96406: _execute() done 46400 1727204549.96408: dumping result to json 46400 1727204549.96409: done dumping result, returning 46400 1727204549.96412: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000000d22] 46400 1727204549.96414: sending task result for task 0affcd87-79f5-1303-fda8-000000000d22 46400 1727204549.96587: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d22 46400 1727204549.96590: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204549.96642: no more pending results, returning what we have 46400 1727204549.96645: results queue empty 46400 1727204549.96646: checking for any_errors_fatal 46400 1727204549.96651: done checking for any_errors_fatal 46400 1727204549.96652: checking for max_fail_percentage 46400 1727204549.96653: done checking for max_fail_percentage 46400 1727204549.96654: checking to see if all hosts have failed and the running result is not ok 46400 1727204549.96655: done checking to see if all hosts have failed 46400 1727204549.96655: getting the remaining hosts for this loop 46400 1727204549.96657: done getting the remaining hosts for this loop 46400 1727204549.96663: getting the next task for host managed-node2 46400 1727204549.96671: done getting next task for host managed-node2 46400 1727204549.96675: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204549.96680: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204549.96690: getting variables 46400 1727204549.96691: in VariableManager get_vars() 46400 1727204549.96722: Calling all_inventory to load vars for managed-node2 46400 1727204549.96725: Calling groups_inventory to load vars for managed-node2 46400 1727204549.96727: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204549.96736: Calling all_plugins_play to load vars for managed-node2 46400 1727204549.96739: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204549.96742: Calling groups_plugins_play to load vars for managed-node2 46400 1727204549.98468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204550.00539: done with get_vars() 46400 1727204550.00613: done getting variables 46400 1727204550.00686: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:30 -0400 (0:00:00.710) 0:00:40.292 ***** 46400 1727204550.00746: entering _queue_task() for managed-node2/service 46400 1727204550.01145: worker is 1 (out of 1 available) 46400 1727204550.01157: exiting _queue_task() for managed-node2/service 46400 1727204550.01175: done queuing things up, now waiting for results queue to drain 46400 1727204550.01177: waiting for pending results... 
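[editor's note] The wpa_supplicant task follows the same service pattern, but as the conditional evaluations below show it only proceeds when the provider is NetworkManager and the role determines a supplicant is required (driven by __network_wpa_supplicant_required / __network_ieee802_1x_connections_defined). A rough sketch of such a conditional task is shown here, assuming the unit name wpa_supplicant and these when-conditions; the real task at roles/network/tasks/main.yml:133 is not shown in this log.

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant               # assumed unit name, not taken from this log
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool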
46400 1727204550.01514: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204550.01702: in run() - task 0affcd87-79f5-1303-fda8-000000000d23 46400 1727204550.01726: variable 'ansible_search_path' from source: unknown 46400 1727204550.01739: variable 'ansible_search_path' from source: unknown 46400 1727204550.01787: calling self._execute() 46400 1727204550.01907: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.01973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.01988: variable 'omit' from source: magic vars 46400 1727204550.02437: variable 'ansible_distribution_major_version' from source: facts 46400 1727204550.02465: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204550.02604: variable 'network_provider' from source: set_fact 46400 1727204550.02616: Evaluated conditional (network_provider == "nm"): True 46400 1727204550.02754: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204550.02865: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204550.03065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204550.05838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204550.05929: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204550.05982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204550.06049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204550.06090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204550.06221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204550.06256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204550.06299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204550.06363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204550.06404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204550.06476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204550.06511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204550.06545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204550.06593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204550.06624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204550.06692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204550.06736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204550.06781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204550.06842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204550.06881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204550.07055: variable 'network_connections' from source: include params 46400 1727204550.07086: variable 'interface' from source: play vars 46400 1727204550.07172: variable 'interface' from source: play vars 46400 1727204550.07293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204550.07530: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204550.07585: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204550.07628: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204550.07671: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204550.07745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204550.07790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204550.07830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204550.07881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204550.07952: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204550.08286: variable 'network_connections' from source: include params 46400 1727204550.08299: variable 'interface' from source: play vars 46400 1727204550.08374: variable 'interface' from source: play vars 46400 1727204550.08413: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204550.08421: when evaluation is False, skipping this task 46400 1727204550.08428: _execute() done 46400 1727204550.08435: dumping result to json 46400 1727204550.08443: done dumping result, returning 46400 1727204550.08455: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000000d23] 46400 1727204550.08492: sending task result for task 0affcd87-79f5-1303-fda8-000000000d23 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204550.08689: no more pending results, returning what we have 46400 1727204550.08694: results queue empty 46400 1727204550.08695: checking for any_errors_fatal 46400 1727204550.08732: done checking for any_errors_fatal 46400 1727204550.08734: checking for max_fail_percentage 46400 1727204550.08736: done checking for max_fail_percentage 46400 1727204550.08737: checking to see if all hosts have failed and the running result is not ok 46400 1727204550.08737: done checking to see if all hosts have failed 46400 1727204550.08738: getting the remaining hosts for this loop 46400 1727204550.08740: done getting the remaining hosts for this loop 46400 1727204550.08745: getting the next task for host managed-node2 46400 1727204550.08763: done getting next task for host managed-node2 46400 1727204550.08769: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204550.08775: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204550.08796: getting variables 46400 1727204550.08798: in VariableManager get_vars() 46400 1727204550.08842: Calling all_inventory to load vars for managed-node2 46400 1727204550.08845: Calling groups_inventory to load vars for managed-node2 46400 1727204550.08848: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204550.08862: Calling all_plugins_play to load vars for managed-node2 46400 1727204550.08887: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204550.08892: Calling groups_plugins_play to load vars for managed-node2 46400 1727204550.09855: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d23 46400 1727204550.09858: WORKER PROCESS EXITING 46400 1727204550.10944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204550.12962: done with get_vars() 46400 1727204550.13001: done getting variables 46400 1727204550.13080: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:30 -0400 (0:00:00.123) 0:00:40.415 ***** 46400 1727204550.13116: entering _queue_task() for managed-node2/service 46400 1727204550.13497: worker is 1 (out of 1 available) 46400 1727204550.13510: exiting _queue_task() for managed-node2/service 46400 1727204550.13524: done queuing things up, now waiting for results queue to drain 46400 1727204550.13525: waiting for pending results... 
46400 1727204550.13989: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204550.14097: in run() - task 0affcd87-79f5-1303-fda8-000000000d24 46400 1727204550.14108: variable 'ansible_search_path' from source: unknown 46400 1727204550.14112: variable 'ansible_search_path' from source: unknown 46400 1727204550.14148: calling self._execute() 46400 1727204550.14229: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.14233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.14242: variable 'omit' from source: magic vars 46400 1727204550.14529: variable 'ansible_distribution_major_version' from source: facts 46400 1727204550.14540: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204550.14628: variable 'network_provider' from source: set_fact 46400 1727204550.14632: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204550.14635: when evaluation is False, skipping this task 46400 1727204550.14637: _execute() done 46400 1727204550.14640: dumping result to json 46400 1727204550.14643: done dumping result, returning 46400 1727204550.14651: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000000d24] 46400 1727204550.14655: sending task result for task 0affcd87-79f5-1303-fda8-000000000d24 46400 1727204550.14751: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d24 46400 1727204550.14753: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204550.14807: no more pending results, returning what we have 46400 1727204550.14811: results queue empty 46400 1727204550.14812: checking for any_errors_fatal 46400 1727204550.14821: done checking for any_errors_fatal 46400 1727204550.14822: checking for max_fail_percentage 46400 1727204550.14824: done checking for max_fail_percentage 46400 1727204550.14825: checking to see if all hosts have failed and the running result is not ok 46400 1727204550.14826: done checking to see if all hosts have failed 46400 1727204550.14827: getting the remaining hosts for this loop 46400 1727204550.14829: done getting the remaining hosts for this loop 46400 1727204550.14833: getting the next task for host managed-node2 46400 1727204550.14843: done getting next task for host managed-node2 46400 1727204550.14847: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204550.14851: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204550.14878: getting variables 46400 1727204550.14880: in VariableManager get_vars() 46400 1727204550.14913: Calling all_inventory to load vars for managed-node2 46400 1727204550.14916: Calling groups_inventory to load vars for managed-node2 46400 1727204550.14918: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204550.14928: Calling all_plugins_play to load vars for managed-node2 46400 1727204550.14930: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204550.14933: Calling groups_plugins_play to load vars for managed-node2 46400 1727204550.16077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204550.17494: done with get_vars() 46400 1727204550.17518: done getting variables 46400 1727204550.17569: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:30 -0400 (0:00:00.044) 0:00:40.460 ***** 46400 1727204550.17598: entering _queue_task() for managed-node2/copy 46400 1727204550.17845: worker is 1 (out of 1 available) 46400 1727204550.17858: exiting _queue_task() for managed-node2/copy 46400 1727204550.17873: done queuing things up, now waiting for results queue to drain 46400 1727204550.17875: waiting for pending results... 
46400 1727204550.18079: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204550.18209: in run() - task 0affcd87-79f5-1303-fda8-000000000d25 46400 1727204550.18219: variable 'ansible_search_path' from source: unknown 46400 1727204550.18223: variable 'ansible_search_path' from source: unknown 46400 1727204550.18255: calling self._execute() 46400 1727204550.18331: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.18335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.18345: variable 'omit' from source: magic vars 46400 1727204550.18624: variable 'ansible_distribution_major_version' from source: facts 46400 1727204550.18634: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204550.18720: variable 'network_provider' from source: set_fact 46400 1727204550.18724: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204550.18728: when evaluation is False, skipping this task 46400 1727204550.18731: _execute() done 46400 1727204550.18733: dumping result to json 46400 1727204550.18736: done dumping result, returning 46400 1727204550.18742: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000000d25] 46400 1727204550.18749: sending task result for task 0affcd87-79f5-1303-fda8-000000000d25 46400 1727204550.18880: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d25 46400 1727204550.18884: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204550.19089: no more pending results, returning what we have 46400 1727204550.19093: results queue empty 46400 1727204550.19094: checking for any_errors_fatal 46400 1727204550.19099: done checking for any_errors_fatal 46400 1727204550.19100: checking for max_fail_percentage 46400 1727204550.19102: done checking for max_fail_percentage 46400 1727204550.19103: checking to see if all hosts have failed and the running result is not ok 46400 1727204550.19103: done checking to see if all hosts have failed 46400 1727204550.19104: getting the remaining hosts for this loop 46400 1727204550.19106: done getting the remaining hosts for this loop 46400 1727204550.19109: getting the next task for host managed-node2 46400 1727204550.19117: done getting next task for host managed-node2 46400 1727204550.19121: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204550.19179: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204550.19203: getting variables 46400 1727204550.19205: in VariableManager get_vars() 46400 1727204550.19241: Calling all_inventory to load vars for managed-node2 46400 1727204550.19245: Calling groups_inventory to load vars for managed-node2 46400 1727204550.19247: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204550.19258: Calling all_plugins_play to load vars for managed-node2 46400 1727204550.19267: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204550.19272: Calling groups_plugins_play to load vars for managed-node2 46400 1727204550.21743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204550.24930: done with get_vars() 46400 1727204550.25085: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:30 -0400 (0:00:00.076) 0:00:40.537 ***** 46400 1727204550.25297: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204550.26053: worker is 1 (out of 1 available) 46400 1727204550.26142: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204550.26156: done queuing things up, now waiting for results queue to drain 46400 1727204550.26158: waiting for pending results... 
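The task queued above is the first in this block to execute a module on the managed node. The trace that follows shows the full round trip: the get_ansible_managed.j2 template lookup that renders the "# Ansible managed" header, reuse of the existing SSH ControlMaster, creation of a remote temp directory, SFTP transfer of the AnsiballZ-packed network_connections module, its execution with /usr/bin/python3.9, and cleanup. The module_args it sends correspond to play data of roughly this shape, inferred from the logged invocation (a sketch; the actual test playbook may define additional keys):

# Inferred from the module_args logged further down; illustrative only.
# network_provider is set earlier in the run via set_fact and is not shown here.
network_connections:
  - name: statebr   # matches the 'interface' play var referenced in the trace
    state: up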
46400 1727204550.27148: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204550.27284: in run() - task 0affcd87-79f5-1303-fda8-000000000d26 46400 1727204550.27298: variable 'ansible_search_path' from source: unknown 46400 1727204550.27302: variable 'ansible_search_path' from source: unknown 46400 1727204550.27337: calling self._execute() 46400 1727204550.27428: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.27432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.27442: variable 'omit' from source: magic vars 46400 1727204550.28527: variable 'ansible_distribution_major_version' from source: facts 46400 1727204550.28539: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204550.28547: variable 'omit' from source: magic vars 46400 1727204550.28621: variable 'omit' from source: magic vars 46400 1727204550.28783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204550.33905: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204550.33972: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204550.34012: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204550.34046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204550.34074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204550.34150: variable 'network_provider' from source: set_fact 46400 1727204550.34282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204550.34309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204550.34334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204550.34375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204550.34389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204550.34467: variable 'omit' from source: magic vars 46400 1727204550.35277: variable 'omit' from source: magic vars 46400 1727204550.35387: variable 'network_connections' from source: include params 46400 1727204550.35400: variable 'interface' from source: play vars 46400 1727204550.35466: variable 'interface' from source: play vars 46400 1727204550.35607: variable 'omit' from source: magic vars 46400 1727204550.35616: variable '__lsr_ansible_managed' from source: task vars 46400 1727204550.35676: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204550.35876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204550.36787: Loaded config def from plugin (lookup/template) 46400 1727204550.36792: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204550.36820: File lookup term: get_ansible_managed.j2 46400 1727204550.36824: variable 'ansible_search_path' from source: unknown 46400 1727204550.36827: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204550.36842: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204550.36859: variable 'ansible_search_path' from source: unknown 46400 1727204550.46786: variable 'ansible_managed' from source: unknown 46400 1727204550.46947: variable 'omit' from source: magic vars 46400 1727204550.46996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204550.47027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204550.47052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204550.47090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204550.47106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204550.47138: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204550.47147: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.47155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.47268: Set connection var ansible_shell_type to sh 46400 1727204550.47295: Set connection var ansible_shell_executable to /bin/sh 46400 1727204550.47306: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204550.47316: Set connection var ansible_connection to ssh 46400 1727204550.47325: Set connection var ansible_pipelining to False 46400 1727204550.47334: Set connection var ansible_timeout to 10 46400 1727204550.47369: variable 'ansible_shell_executable' from source: unknown 46400 1727204550.47378: variable 'ansible_connection' from source: unknown 46400 1727204550.47391: variable 'ansible_module_compression' 
from source: unknown 46400 1727204550.47406: variable 'ansible_shell_type' from source: unknown 46400 1727204550.47414: variable 'ansible_shell_executable' from source: unknown 46400 1727204550.47421: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.47429: variable 'ansible_pipelining' from source: unknown 46400 1727204550.47436: variable 'ansible_timeout' from source: unknown 46400 1727204550.47443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.47598: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204550.47632: variable 'omit' from source: magic vars 46400 1727204550.47643: starting attempt loop 46400 1727204550.47649: running the handler 46400 1727204550.47676: _low_level_execute_command(): starting 46400 1727204550.47688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204550.48551: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.48581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.48595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.48621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.48688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.48699: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.48710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.48742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204550.48756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.48774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.48801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.48817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.48829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.48848: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.48868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.48935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204550.48969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.48987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.49066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.50721: stdout chunk (state=3): >>>/root <<< 46400 1727204550.50888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204550.50930: stderr chunk (state=3): >>><<< 46400 1727204550.50934: stdout 
chunk (state=3): >>><<< 46400 1727204550.51054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204550.51058: _low_level_execute_command(): starting 46400 1727204550.51063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909 `" && echo ansible-tmp-1727204550.5095356-49376-39657412484909="` echo /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909 `" ) && sleep 0' 46400 1727204550.51686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.51702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.51724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.51742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.51791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.51804: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.51827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.51845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204550.51857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.51872: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.51885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.51898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.51915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.51936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.51949: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.51967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.52050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 46400 1727204550.52078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.52096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.52214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.54068: stdout chunk (state=3): >>>ansible-tmp-1727204550.5095356-49376-39657412484909=/root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909 <<< 46400 1727204550.54266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204550.54270: stdout chunk (state=3): >>><<< 46400 1727204550.54273: stderr chunk (state=3): >>><<< 46400 1727204550.54613: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204550.5095356-49376-39657412484909=/root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204550.54621: variable 'ansible_module_compression' from source: unknown 46400 1727204550.54623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204550.54625: variable 'ansible_facts' from source: unknown 46400 1727204550.54669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/AnsiballZ_network_connections.py 46400 1727204550.55430: Sending initial data 46400 1727204550.55435: Sent initial data (167 bytes) 46400 1727204550.57210: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.57402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.57420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.57440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.57486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.57499: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.57515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.57536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
46400 1727204550.57548: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.57559: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.57576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.57590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.57606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.57618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.57629: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.57644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.57720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204550.57743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.57760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.57836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.59589: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204550.59632: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204550.59681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp00ye7jd9 /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/AnsiballZ_network_connections.py <<< 46400 1727204550.59718: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204550.61327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204550.61534: stderr chunk (state=3): >>><<< 46400 1727204550.61537: stdout chunk (state=3): >>><<< 46400 1727204550.61540: done transferring module to remote 46400 1727204550.61542: _low_level_execute_command(): starting 46400 1727204550.61544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/ /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/AnsiballZ_network_connections.py && sleep 0' 46400 1727204550.62151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.62159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.62181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.62194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.62236: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.62243: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.62253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.62270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204550.62278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.62285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.62291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.62300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.62312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.62321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.62328: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.62341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.62416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204550.62430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.62440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.62511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.64249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204550.64355: stderr chunk (state=3): >>><<< 46400 1727204550.64375: stdout chunk (state=3): >>><<< 46400 1727204550.64478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204550.64486: _low_level_execute_command(): starting 46400 1727204550.64488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/AnsiballZ_network_connections.py && sleep 0' 46400 1727204550.65085: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.65097: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204550.65101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.65124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.65153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.65157: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.65182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.65186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204550.65193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.65199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.65207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.65216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.65227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.65236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.65241: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.65252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.65336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204550.65340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.65401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.65474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.87358: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204550.88788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204550.88884: stderr chunk (state=3): >>><<< 46400 1727204550.88887: stdout chunk (state=3): >>><<< 46400 1727204550.88910: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
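The module returns changed=false: profile 'statebr' is already active, so bringing it up is a no-op, and the stderr line about uuid 985b3c37-4ecd-406f-bfdf-7018e6e80d39 being "skipped because already active" is surfaced in the task result below. A hypothetical follow-up check, not part of this run, that would confirm the same thing from a play could look like:

# Hypothetical verification task; nothing like it appears in the logged play.
- name: Confirm the statebr profile is active
  ansible.builtin.command: nmcli -g NAME connection show --active
  register: active_profiles
  changed_when: false
  failed_when: "'statebr' not in active_profiles.stdout"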
46400 1727204550.88947: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204550.88956: _low_level_execute_command(): starting 46400 1727204550.88962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204550.5095356-49376-39657412484909/ > /dev/null 2>&1 && sleep 0' 46400 1727204550.89660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204550.89677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.89689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.89710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.89751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.89758: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204550.89773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.89787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204550.89795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204550.89803: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204550.89819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204550.89830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204550.89843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204550.89851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204550.89859: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204550.89872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204550.89950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204550.89968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204550.89986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204550.90068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204550.91872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204550.91973: stderr chunk (state=3): >>><<< 46400 
1727204550.91991: stdout chunk (state=3): >>><<< 46400 1727204550.92375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204550.92380: handler run complete 46400 1727204550.92383: attempt loop complete, returning result 46400 1727204550.92385: _execute() done 46400 1727204550.92388: dumping result to json 46400 1727204550.92390: done dumping result, returning 46400 1727204550.92392: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000000d26] 46400 1727204550.92394: sending task result for task 0affcd87-79f5-1303-fda8-000000000d26 46400 1727204550.92477: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d26 46400 1727204550.92481: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active 46400 1727204550.92585: no more pending results, returning what we have 46400 1727204550.92589: results queue empty 46400 1727204550.92592: checking for any_errors_fatal 46400 1727204550.92599: done checking for any_errors_fatal 46400 1727204550.92600: checking for max_fail_percentage 46400 1727204550.92602: done checking for max_fail_percentage 46400 1727204550.92603: checking to see if all hosts have failed and the running result is not ok 46400 1727204550.92604: done checking to see if all hosts have failed 46400 1727204550.92604: getting the remaining hosts for this loop 46400 1727204550.92606: done getting the remaining hosts for this loop 46400 1727204550.92610: getting the next task for host managed-node2 46400 1727204550.92617: done getting next task for host managed-node2 46400 1727204550.92621: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204550.92627: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204550.92637: getting variables 46400 1727204550.92639: in VariableManager get_vars() 46400 1727204550.92678: Calling all_inventory to load vars for managed-node2 46400 1727204550.92681: Calling groups_inventory to load vars for managed-node2 46400 1727204550.92684: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204550.92695: Calling all_plugins_play to load vars for managed-node2 46400 1727204550.92698: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204550.92701: Calling groups_plugins_play to load vars for managed-node2 46400 1727204550.94361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204550.96103: done with get_vars() 46400 1727204550.96136: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:30 -0400 (0:00:00.709) 0:00:41.246 ***** 46400 1727204550.96231: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204550.96590: worker is 1 (out of 1 available) 46400 1727204550.96603: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204550.96616: done queuing things up, now waiting for results queue to drain 46400 1727204550.96617: waiting for pending results... 
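The module_args echoed back in the task result above show how the role drove its provider module: a single profile named 'statebr' brought to state 'up' with provider 'nm', and force_state_change/ignore_errors left at false. A minimal, illustrative sketch of a play that would produce that invocation follows; the actual play is not reproduced in this log, so only the values echoed in the result are taken from it.

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            # Echoed back in the module_args of the result above.
            network_connections:
              - name: statebr
                state: up
            # Left at the role default of {} here, which is why the
            # "Configure networking state" task below is skipped.
            network_state: {}
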
46400 1727204550.96913: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204550.97073: in run() - task 0affcd87-79f5-1303-fda8-000000000d27 46400 1727204550.97094: variable 'ansible_search_path' from source: unknown 46400 1727204550.97102: variable 'ansible_search_path' from source: unknown 46400 1727204550.97146: calling self._execute() 46400 1727204550.97256: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204550.97271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204550.97290: variable 'omit' from source: magic vars 46400 1727204550.97677: variable 'ansible_distribution_major_version' from source: facts 46400 1727204550.97696: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204550.97812: variable 'network_state' from source: role '' defaults 46400 1727204550.97831: Evaluated conditional (network_state != {}): False 46400 1727204550.97839: when evaluation is False, skipping this task 46400 1727204550.97845: _execute() done 46400 1727204550.97853: dumping result to json 46400 1727204550.97860: done dumping result, returning 46400 1727204550.97874: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000000d27] 46400 1727204550.97885: sending task result for task 0affcd87-79f5-1303-fda8-000000000d27 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204550.98071: no more pending results, returning what we have 46400 1727204550.98077: results queue empty 46400 1727204550.98078: checking for any_errors_fatal 46400 1727204550.98094: done checking for any_errors_fatal 46400 1727204550.98095: checking for max_fail_percentage 46400 1727204550.98097: done checking for max_fail_percentage 46400 1727204550.98098: checking to see if all hosts have failed and the running result is not ok 46400 1727204550.98099: done checking to see if all hosts have failed 46400 1727204550.98100: getting the remaining hosts for this loop 46400 1727204550.98103: done getting the remaining hosts for this loop 46400 1727204550.98107: getting the next task for host managed-node2 46400 1727204550.98121: done getting next task for host managed-node2 46400 1727204550.98126: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204550.98132: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204550.98155: getting variables 46400 1727204550.98157: in VariableManager get_vars() 46400 1727204550.98205: Calling all_inventory to load vars for managed-node2 46400 1727204550.98210: Calling groups_inventory to load vars for managed-node2 46400 1727204550.98213: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204550.98227: Calling all_plugins_play to load vars for managed-node2 46400 1727204550.98230: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204550.98234: Calling groups_plugins_play to load vars for managed-node2 46400 1727204550.99294: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d27 46400 1727204550.99297: WORKER PROCESS EXITING 46400 1727204551.00406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.03622: done with get_vars() 46400 1727204551.03661: done getting variables 46400 1727204551.03723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.075) 0:00:41.322 ***** 46400 1727204551.03768: entering _queue_task() for managed-node2/debug 46400 1727204551.04121: worker is 1 (out of 1 available) 46400 1727204551.04134: exiting _queue_task() for managed-node2/debug 46400 1727204551.04147: done queuing things up, now waiting for results queue to drain 46400 1727204551.04149: waiting for pending results... 
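The task queued here (tasks/main.yml:177) is a plain debug action; its result a little further down simply prints __network_connections_result.stderr_lines, the registered output of the connection-profile task. A hedged sketch of such a task, assuming the conventional debug form (the real task body is not included in this log):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines
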
46400 1727204551.04455: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204551.04625: in run() - task 0affcd87-79f5-1303-fda8-000000000d28 46400 1727204551.04647: variable 'ansible_search_path' from source: unknown 46400 1727204551.04655: variable 'ansible_search_path' from source: unknown 46400 1727204551.04702: calling self._execute() 46400 1727204551.06788: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.06804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.06820: variable 'omit' from source: magic vars 46400 1727204551.07402: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.07425: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.07436: variable 'omit' from source: magic vars 46400 1727204551.07527: variable 'omit' from source: magic vars 46400 1727204551.07570: variable 'omit' from source: magic vars 46400 1727204551.07620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204551.07669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204551.07701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204551.07726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.07744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.07780: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204551.07788: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.07795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.07900: Set connection var ansible_shell_type to sh 46400 1727204551.07915: Set connection var ansible_shell_executable to /bin/sh 46400 1727204551.07925: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204551.07935: Set connection var ansible_connection to ssh 46400 1727204551.07943: Set connection var ansible_pipelining to False 46400 1727204551.07956: Set connection var ansible_timeout to 10 46400 1727204551.07988: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.07995: variable 'ansible_connection' from source: unknown 46400 1727204551.08001: variable 'ansible_module_compression' from source: unknown 46400 1727204551.08007: variable 'ansible_shell_type' from source: unknown 46400 1727204551.08013: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.08019: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.08025: variable 'ansible_pipelining' from source: unknown 46400 1727204551.08031: variable 'ansible_timeout' from source: unknown 46400 1727204551.08038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.08190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204551.08209: variable 'omit' from source: magic vars 46400 1727204551.08218: starting attempt loop 46400 1727204551.08224: running the handler 46400 1727204551.08365: variable '__network_connections_result' from source: set_fact 46400 1727204551.08426: handler run complete 46400 1727204551.08448: attempt loop complete, returning result 46400 1727204551.08455: _execute() done 46400 1727204551.08462: dumping result to json 46400 1727204551.08471: done dumping result, returning 46400 1727204551.08483: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000000d28] 46400 1727204551.08494: sending task result for task 0affcd87-79f5-1303-fda8-000000000d28 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active" ] } 46400 1727204551.08674: no more pending results, returning what we have 46400 1727204551.08679: results queue empty 46400 1727204551.08681: checking for any_errors_fatal 46400 1727204551.08688: done checking for any_errors_fatal 46400 1727204551.08689: checking for max_fail_percentage 46400 1727204551.08691: done checking for max_fail_percentage 46400 1727204551.08692: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.08693: done checking to see if all hosts have failed 46400 1727204551.08694: getting the remaining hosts for this loop 46400 1727204551.08696: done getting the remaining hosts for this loop 46400 1727204551.08700: getting the next task for host managed-node2 46400 1727204551.08709: done getting next task for host managed-node2 46400 1727204551.08714: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204551.08720: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204551.08732: getting variables 46400 1727204551.08734: in VariableManager get_vars() 46400 1727204551.08776: Calling all_inventory to load vars for managed-node2 46400 1727204551.08779: Calling groups_inventory to load vars for managed-node2 46400 1727204551.08781: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.08792: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.08795: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.08798: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.10082: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d28 46400 1727204551.10086: WORKER PROCESS EXITING 46400 1727204551.10511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.13718: done with get_vars() 46400 1727204551.13755: done getting variables 46400 1727204551.14026: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.103) 0:00:41.425 ***** 46400 1727204551.14078: entering _queue_task() for managed-node2/debug 46400 1727204551.14832: worker is 1 (out of 1 available) 46400 1727204551.14847: exiting _queue_task() for managed-node2/debug 46400 1727204551.14859: done queuing things up, now waiting for results queue to drain 46400 1727204551.14861: waiting for pending results... 
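The companion task at tasks/main.yml:181 prints the whole registered result rather than just its stderr lines; stderr_lines is only the stderr string split on newlines, which is why the two outputs repeat the same "skipped because already active" message. A sketch under the same assumption as above:

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result
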
46400 1727204551.16007: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204551.16178: in run() - task 0affcd87-79f5-1303-fda8-000000000d29 46400 1727204551.16199: variable 'ansible_search_path' from source: unknown 46400 1727204551.16206: variable 'ansible_search_path' from source: unknown 46400 1727204551.16250: calling self._execute() 46400 1727204551.16350: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.16365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.16379: variable 'omit' from source: magic vars 46400 1727204551.16748: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.16767: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.16780: variable 'omit' from source: magic vars 46400 1727204551.16850: variable 'omit' from source: magic vars 46400 1727204551.16889: variable 'omit' from source: magic vars 46400 1727204551.16939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204551.16982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204551.17016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204551.17038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.17054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.17090: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204551.17099: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.17106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.17208: Set connection var ansible_shell_type to sh 46400 1727204551.17229: Set connection var ansible_shell_executable to /bin/sh 46400 1727204551.17241: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204551.17252: Set connection var ansible_connection to ssh 46400 1727204551.17261: Set connection var ansible_pipelining to False 46400 1727204551.17274: Set connection var ansible_timeout to 10 46400 1727204551.17303: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.17310: variable 'ansible_connection' from source: unknown 46400 1727204551.17317: variable 'ansible_module_compression' from source: unknown 46400 1727204551.17322: variable 'ansible_shell_type' from source: unknown 46400 1727204551.17332: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.17339: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.17347: variable 'ansible_pipelining' from source: unknown 46400 1727204551.17354: variable 'ansible_timeout' from source: unknown 46400 1727204551.17361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.17514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204551.17529: variable 'omit' from source: magic vars 46400 1727204551.17539: starting attempt loop 46400 1727204551.17550: running the handler 46400 1727204551.17601: variable '__network_connections_result' from source: set_fact 46400 1727204551.17692: variable '__network_connections_result' from source: set_fact 46400 1727204551.17803: handler run complete 46400 1727204551.17833: attempt loop complete, returning result 46400 1727204551.17840: _execute() done 46400 1727204551.17847: dumping result to json 46400 1727204551.17855: done dumping result, returning 46400 1727204551.17871: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000000d29] 46400 1727204551.17881: sending task result for task 0affcd87-79f5-1303-fda8-000000000d29 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 985b3c37-4ecd-406f-bfdf-7018e6e80d39 skipped because already active" ] } } 46400 1727204551.18088: no more pending results, returning what we have 46400 1727204551.18093: results queue empty 46400 1727204551.18094: checking for any_errors_fatal 46400 1727204551.18102: done checking for any_errors_fatal 46400 1727204551.18103: checking for max_fail_percentage 46400 1727204551.18105: done checking for max_fail_percentage 46400 1727204551.18106: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.18107: done checking to see if all hosts have failed 46400 1727204551.18108: getting the remaining hosts for this loop 46400 1727204551.18110: done getting the remaining hosts for this loop 46400 1727204551.18113: getting the next task for host managed-node2 46400 1727204551.18122: done getting next task for host managed-node2 46400 1727204551.18126: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204551.18131: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204551.18144: getting variables 46400 1727204551.18146: in VariableManager get_vars() 46400 1727204551.18188: Calling all_inventory to load vars for managed-node2 46400 1727204551.18191: Calling groups_inventory to load vars for managed-node2 46400 1727204551.18193: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.18206: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.18216: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.18219: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.19669: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d29 46400 1727204551.19674: WORKER PROCESS EXITING 46400 1727204551.20107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.21756: done with get_vars() 46400 1727204551.21792: done getting variables 46400 1727204551.21849: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.078) 0:00:41.503 ***** 46400 1727204551.21890: entering _queue_task() for managed-node2/debug 46400 1727204551.22240: worker is 1 (out of 1 available) 46400 1727204551.22256: exiting _queue_task() for managed-node2/debug 46400 1727204551.22272: done queuing things up, now waiting for results queue to drain 46400 1727204551.22274: waiting for pending results... 
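Two patterns are visible in the remainder of this trace: the network_state tasks are guarded by a condition the log reports as false_condition "network_state != {}", so they skip while the variable stays at its empty default, and the role then re-tests connectivity with the ping module, which uploads AnsiballZ_ping.py over the multiplexed SSH connection and expects a {"ping": "pong"} reply. An illustrative sketch of both patterns (task bodies assumed, not copied from the role):

    - name: Show debug messages for the network_state
      debug:
        var: network_state          # illustrative; the variable actually printed is not shown here
      when: network_state != {}     # reported as "false_condition" when the task skips

    - name: Re-test connectivity
      ping:
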
46400 1727204551.22979: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204551.23307: in run() - task 0affcd87-79f5-1303-fda8-000000000d2a 46400 1727204551.23390: variable 'ansible_search_path' from source: unknown 46400 1727204551.23399: variable 'ansible_search_path' from source: unknown 46400 1727204551.23440: calling self._execute() 46400 1727204551.23584: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.23710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.23726: variable 'omit' from source: magic vars 46400 1727204551.24516: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.24534: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.24777: variable 'network_state' from source: role '' defaults 46400 1727204551.24884: Evaluated conditional (network_state != {}): False 46400 1727204551.24898: when evaluation is False, skipping this task 46400 1727204551.24907: _execute() done 46400 1727204551.24914: dumping result to json 46400 1727204551.24921: done dumping result, returning 46400 1727204551.24932: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000000d2a] 46400 1727204551.24942: sending task result for task 0affcd87-79f5-1303-fda8-000000000d2a skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204551.25099: no more pending results, returning what we have 46400 1727204551.25105: results queue empty 46400 1727204551.25106: checking for any_errors_fatal 46400 1727204551.25122: done checking for any_errors_fatal 46400 1727204551.25123: checking for max_fail_percentage 46400 1727204551.25125: done checking for max_fail_percentage 46400 1727204551.25126: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.25127: done checking to see if all hosts have failed 46400 1727204551.25128: getting the remaining hosts for this loop 46400 1727204551.25130: done getting the remaining hosts for this loop 46400 1727204551.25135: getting the next task for host managed-node2 46400 1727204551.25146: done getting next task for host managed-node2 46400 1727204551.25151: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204551.25156: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204551.25183: getting variables 46400 1727204551.25185: in VariableManager get_vars() 46400 1727204551.25229: Calling all_inventory to load vars for managed-node2 46400 1727204551.25232: Calling groups_inventory to load vars for managed-node2 46400 1727204551.25235: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.25249: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.25252: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.25255: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.26689: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d2a 46400 1727204551.26694: WORKER PROCESS EXITING 46400 1727204551.27534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.29975: done with get_vars() 46400 1727204551.30012: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.082) 0:00:41.585 ***** 46400 1727204551.30123: entering _queue_task() for managed-node2/ping 46400 1727204551.30482: worker is 1 (out of 1 available) 46400 1727204551.30496: exiting _queue_task() for managed-node2/ping 46400 1727204551.30511: done queuing things up, now waiting for results queue to drain 46400 1727204551.30512: waiting for pending results... 46400 1727204551.30811: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204551.30968: in run() - task 0affcd87-79f5-1303-fda8-000000000d2b 46400 1727204551.30988: variable 'ansible_search_path' from source: unknown 46400 1727204551.30995: variable 'ansible_search_path' from source: unknown 46400 1727204551.31031: calling self._execute() 46400 1727204551.31138: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.31152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.31174: variable 'omit' from source: magic vars 46400 1727204551.31559: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.31579: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.31589: variable 'omit' from source: magic vars 46400 1727204551.31670: variable 'omit' from source: magic vars 46400 1727204551.31711: variable 'omit' from source: magic vars 46400 1727204551.31759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204551.31800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204551.31828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204551.31850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.31870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204551.31904: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204551.31913: variable 'ansible_host' from source: host vars 
for 'managed-node2' 46400 1727204551.31921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.32021: Set connection var ansible_shell_type to sh 46400 1727204551.32037: Set connection var ansible_shell_executable to /bin/sh 46400 1727204551.32050: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204551.32060: Set connection var ansible_connection to ssh 46400 1727204551.32072: Set connection var ansible_pipelining to False 46400 1727204551.32081: Set connection var ansible_timeout to 10 46400 1727204551.32112: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.32121: variable 'ansible_connection' from source: unknown 46400 1727204551.32128: variable 'ansible_module_compression' from source: unknown 46400 1727204551.32135: variable 'ansible_shell_type' from source: unknown 46400 1727204551.32142: variable 'ansible_shell_executable' from source: unknown 46400 1727204551.32153: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.32161: variable 'ansible_pipelining' from source: unknown 46400 1727204551.32171: variable 'ansible_timeout' from source: unknown 46400 1727204551.32178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.32392: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204551.32407: variable 'omit' from source: magic vars 46400 1727204551.32415: starting attempt loop 46400 1727204551.32421: running the handler 46400 1727204551.32440: _low_level_execute_command(): starting 46400 1727204551.32451: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204551.33220: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.33242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.33258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.33281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.33326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.33339: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.33357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.33380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.33393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.33405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204551.33418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.33434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.33452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.33471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.33484: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204551.33497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.33582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.33607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.33625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.33705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.35394: stdout chunk (state=3): >>>/root <<< 46400 1727204551.35600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204551.35604: stdout chunk (state=3): >>><<< 46400 1727204551.35606: stderr chunk (state=3): >>><<< 46400 1727204551.35730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204551.35734: _low_level_execute_command(): starting 46400 1727204551.35738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403 `" && echo ansible-tmp-1727204551.3562849-49417-181106531933403="` echo /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403 `" ) && sleep 0' 46400 1727204551.38217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.38889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.38900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.38914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.38960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.38972: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.38983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.38996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.39004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.39010: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 46400 1727204551.39018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.39027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.39038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.39046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.39052: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204551.39061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.39143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.39159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.39167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.39237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.41112: stdout chunk (state=3): >>>ansible-tmp-1727204551.3562849-49417-181106531933403=/root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403 <<< 46400 1727204551.41310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204551.41314: stdout chunk (state=3): >>><<< 46400 1727204551.41318: stderr chunk (state=3): >>><<< 46400 1727204551.41364: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204551.3562849-49417-181106531933403=/root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204551.41418: variable 'ansible_module_compression' from source: unknown 46400 1727204551.41459: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204551.41499: variable 'ansible_facts' from source: unknown 46400 1727204551.41574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/AnsiballZ_ping.py 46400 1727204551.42041: Sending initial data 46400 1727204551.42044: Sent initial data (153 bytes) 46400 1727204551.44973: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.44987: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 46400 1727204551.44997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.45029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.45075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.45083: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.45110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.45118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.45125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204551.45134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.45143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.45157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.45168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.45176: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204551.45192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.45284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.45289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.45298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.45359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.47059: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204551.47092: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204551.47142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpj8l5psc6 /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/AnsiballZ_ping.py <<< 46400 1727204551.47179: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204551.48783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204551.49073: stderr chunk (state=3): >>><<< 46400 1727204551.49076: stdout chunk (state=3): >>><<< 46400 1727204551.49079: done transferring module to remote 46400 1727204551.49081: _low_level_execute_command(): starting 46400 1727204551.49087: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/ /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/AnsiballZ_ping.py && sleep 0' 46400 1727204551.49769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.49787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.49802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.49818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.49869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.49882: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.49898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.49916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.49926: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.49938: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204551.49953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.49968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.49984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.49995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.50006: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204551.50020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.50108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.50130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.50144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.50220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.52074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204551.52078: stdout chunk (state=3): >>><<< 46400 1727204551.52081: stderr chunk (state=3): >>><<< 46400 1727204551.52190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204551.52194: _low_level_execute_command(): starting 46400 1727204551.52197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/AnsiballZ_ping.py && sleep 0' 46400 1727204551.53592: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.53603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.53613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.53627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.53675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.53781: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.53791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.53805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.53813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.53820: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204551.53831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.53837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.53849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.53857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.53868: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204551.53878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.53948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.53985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.53995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.54238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.67319: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204551.68394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204551.68399: stdout chunk (state=3): >>><<< 46400 1727204551.68404: stderr chunk (state=3): >>><<< 46400 1727204551.68423: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204551.68453: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204551.68464: _low_level_execute_command(): starting 46400 1727204551.68478: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204551.3562849-49417-181106531933403/ > /dev/null 2>&1 && sleep 0' 46400 1727204551.69819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204551.69836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.69852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.69881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.69932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.69945: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204551.69959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.69984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204551.69995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204551.70009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204551.70020: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204551.70035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204551.70054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204551.70068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204551.70085: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204551.70099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204551.70179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204551.70206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204551.70229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204551.70303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204551.72088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204551.72155: stderr chunk (state=3): >>><<< 46400 1727204551.72159: stdout chunk (state=3): >>><<< 46400 1727204551.72199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204551.72202: handler run complete 46400 1727204551.72243: attempt loop complete, returning result 46400 1727204551.72246: _execute() done 46400 1727204551.72248: dumping result to json 46400 1727204551.72250: done dumping result, returning 46400 1727204551.72252: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000000d2b] 46400 1727204551.72254: sending task result for task 0affcd87-79f5-1303-fda8-000000000d2b 46400 1727204551.72689: done sending task result for task 0affcd87-79f5-1303-fda8-000000000d2b 46400 1727204551.72693: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204551.72755: no more pending results, returning what we have 46400 1727204551.72762: results queue empty 46400 1727204551.72765: checking for any_errors_fatal 46400 1727204551.72771: done checking for any_errors_fatal 46400 1727204551.72772: checking for max_fail_percentage 46400 1727204551.72774: done checking for max_fail_percentage 46400 
1727204551.72774: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.72775: done checking to see if all hosts have failed 46400 1727204551.72776: getting the remaining hosts for this loop 46400 1727204551.72777: done getting the remaining hosts for this loop 46400 1727204551.72783: getting the next task for host managed-node2 46400 1727204551.72794: done getting next task for host managed-node2 46400 1727204551.72796: ^ task is: TASK: meta (role_complete) 46400 1727204551.72801: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204551.72813: getting variables 46400 1727204551.72815: in VariableManager get_vars() 46400 1727204551.72852: Calling all_inventory to load vars for managed-node2 46400 1727204551.72855: Calling groups_inventory to load vars for managed-node2 46400 1727204551.72857: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.72875: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.72878: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.72882: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.81878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.83570: done with get_vars() 46400 1727204551.83601: done getting variables 46400 1727204551.83685: done queuing things up, now waiting for results queue to drain 46400 1727204551.83688: results queue empty 46400 1727204551.83689: checking for any_errors_fatal 46400 1727204551.83692: done checking for any_errors_fatal 46400 1727204551.83693: checking for max_fail_percentage 46400 1727204551.83694: done checking for max_fail_percentage 46400 1727204551.83695: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.83696: done checking to see if all hosts have failed 46400 1727204551.83696: getting the remaining hosts for this loop 46400 1727204551.83697: done getting the remaining hosts for this loop 46400 1727204551.83706: getting the next task for host managed-node2 46400 1727204551.83712: done getting next task for host managed-node2 46400 1727204551.83714: ^ task is: TASK: Asserts 46400 1727204551.83716: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204551.83719: getting variables 46400 1727204551.83720: in VariableManager get_vars() 46400 1727204551.83732: Calling all_inventory to load vars for managed-node2 46400 1727204551.83734: Calling groups_inventory to load vars for managed-node2 46400 1727204551.83737: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.83742: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.83745: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.83747: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.84963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.86769: done with get_vars() 46400 1727204551.86792: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.567) 0:00:42.153 ***** 46400 1727204551.86875: entering _queue_task() for managed-node2/include_tasks 46400 1727204551.87228: worker is 1 (out of 1 available) 46400 1727204551.87241: exiting _queue_task() for managed-node2/include_tasks 46400 1727204551.87254: done queuing things up, now waiting for results queue to drain 46400 1727204551.87256: waiting for pending results... 
46400 1727204551.87563: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204551.87705: in run() - task 0affcd87-79f5-1303-fda8-000000000a4e 46400 1727204551.87726: variable 'ansible_search_path' from source: unknown 46400 1727204551.87734: variable 'ansible_search_path' from source: unknown 46400 1727204551.87790: variable 'lsr_assert' from source: include params 46400 1727204551.88026: variable 'lsr_assert' from source: include params 46400 1727204551.88110: variable 'omit' from source: magic vars 46400 1727204551.88282: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.88303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.88321: variable 'omit' from source: magic vars 46400 1727204551.88585: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.88602: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.88613: variable 'item' from source: unknown 46400 1727204551.88687: variable 'item' from source: unknown 46400 1727204551.88733: variable 'item' from source: unknown 46400 1727204551.88804: variable 'item' from source: unknown 46400 1727204551.89010: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.89023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.89039: variable 'omit' from source: magic vars 46400 1727204551.89208: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.89219: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.89228: variable 'item' from source: unknown 46400 1727204551.89302: variable 'item' from source: unknown 46400 1727204551.89339: variable 'item' from source: unknown 46400 1727204551.89411: variable 'item' from source: unknown 46400 1727204551.89510: dumping result to json 46400 1727204551.89519: done dumping result, returning 46400 1727204551.89529: done running TaskExecutor() for managed-node2/TASK: Asserts [0affcd87-79f5-1303-fda8-000000000a4e] 46400 1727204551.89538: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4e 46400 1727204551.89631: no more pending results, returning what we have 46400 1727204551.89639: in VariableManager get_vars() 46400 1727204551.89690: Calling all_inventory to load vars for managed-node2 46400 1727204551.89693: Calling groups_inventory to load vars for managed-node2 46400 1727204551.89697: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.89712: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.89716: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.89719: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.90783: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a4e 46400 1727204551.90787: WORKER PROCESS EXITING 46400 1727204551.91530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.93240: done with get_vars() 46400 1727204551.93269: variable 'ansible_search_path' from source: unknown 46400 1727204551.93271: variable 'ansible_search_path' from source: unknown 46400 1727204551.93316: variable 'ansible_search_path' from source: unknown 46400 1727204551.93318: variable 'ansible_search_path' from source: unknown 46400 1727204551.93348: we have included files to process 46400 1727204551.93349: generating all_blocks data 
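A minimal sketch of what the "Asserts" include at run_test.yml:36 likely looks like, based on the lsr_assert items and the distribution conditional evaluated in the log above; the exact layout of run_test.yml is an assumption, not something this output confirms:

    # run_test.yml (hypothetical reconstruction from the logged variables)
    - name: Asserts
      include_tasks: "{{ item }}"                        # items such as tasks/assert_device_present.yml
      loop: "{{ lsr_assert }}"                           # lsr_assert comes from include params per the log
      when: ansible_distribution_major_version != '6'    # conditional evaluated True for each item above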
46400 1727204551.93351: done generating all_blocks data 46400 1727204551.93357: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204551.93358: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204551.93363: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204551.93484: in VariableManager get_vars() 46400 1727204551.93507: done with get_vars() 46400 1727204551.93625: done processing included file 46400 1727204551.93627: iterating over new_blocks loaded from include file 46400 1727204551.93628: in VariableManager get_vars() 46400 1727204551.93645: done with get_vars() 46400 1727204551.93646: filtering new block on tags 46400 1727204551.93686: done filtering new block on tags 46400 1727204551.93689: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item=tasks/assert_device_present.yml) 46400 1727204551.93694: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204551.93695: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204551.93698: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 46400 1727204551.93802: in VariableManager get_vars() 46400 1727204551.93821: done with get_vars() 46400 1727204551.94047: done processing included file 46400 1727204551.94049: iterating over new_blocks loaded from include file 46400 1727204551.94050: in VariableManager get_vars() 46400 1727204551.94068: done with get_vars() 46400 1727204551.94070: filtering new block on tags 46400 1727204551.94116: done filtering new block on tags 46400 1727204551.94118: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 46400 1727204551.94122: extending task lists for all hosts with included blocks 46400 1727204551.95082: done extending task lists 46400 1727204551.95083: done processing included files 46400 1727204551.95084: results queue empty 46400 1727204551.95085: checking for any_errors_fatal 46400 1727204551.95087: done checking for any_errors_fatal 46400 1727204551.95088: checking for max_fail_percentage 46400 1727204551.95089: done checking for max_fail_percentage 46400 1727204551.95090: checking to see if all hosts have failed and the running result is not ok 46400 1727204551.95091: done checking to see if all hosts have failed 46400 1727204551.95092: getting the remaining hosts for this loop 46400 1727204551.95093: done getting the remaining hosts for this loop 46400 1727204551.95095: getting the next task for host managed-node2 46400 1727204551.95100: done getting next task for host managed-node2 46400 1727204551.95102: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204551.95105: ^ state is: HOST STATE: block=5, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204551.95114: getting variables 46400 1727204551.95115: in VariableManager get_vars() 46400 1727204551.95125: Calling all_inventory to load vars for managed-node2 46400 1727204551.95127: Calling groups_inventory to load vars for managed-node2 46400 1727204551.95129: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.95135: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.95137: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.95140: Calling groups_plugins_play to load vars for managed-node2 46400 1727204551.96466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204551.98142: done with get_vars() 46400 1727204551.98179: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:02:31 -0400 (0:00:00.113) 0:00:42.267 ***** 46400 1727204551.98259: entering _queue_task() for managed-node2/include_tasks 46400 1727204551.98612: worker is 1 (out of 1 available) 46400 1727204551.98625: exiting _queue_task() for managed-node2/include_tasks 46400 1727204551.98638: done queuing things up, now waiting for results queue to drain 46400 1727204551.98640: waiting for pending results... 
46400 1727204551.98930: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204551.99052: in run() - task 0affcd87-79f5-1303-fda8-000000000e86 46400 1727204551.99078: variable 'ansible_search_path' from source: unknown 46400 1727204551.99090: variable 'ansible_search_path' from source: unknown 46400 1727204551.99131: calling self._execute() 46400 1727204551.99237: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204551.99249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204551.99268: variable 'omit' from source: magic vars 46400 1727204551.99674: variable 'ansible_distribution_major_version' from source: facts 46400 1727204551.99692: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204551.99704: _execute() done 46400 1727204551.99712: dumping result to json 46400 1727204551.99721: done dumping result, returning 46400 1727204551.99733: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-000000000e86] 46400 1727204551.99746: sending task result for task 0affcd87-79f5-1303-fda8-000000000e86 46400 1727204551.99891: no more pending results, returning what we have 46400 1727204551.99896: in VariableManager get_vars() 46400 1727204551.99941: Calling all_inventory to load vars for managed-node2 46400 1727204551.99944: Calling groups_inventory to load vars for managed-node2 46400 1727204551.99947: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204551.99967: Calling all_plugins_play to load vars for managed-node2 46400 1727204551.99971: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204551.99974: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.01210: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e86 46400 1727204552.01215: WORKER PROCESS EXITING 46400 1727204552.01924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.03080: done with get_vars() 46400 1727204552.03097: variable 'ansible_search_path' from source: unknown 46400 1727204552.03099: variable 'ansible_search_path' from source: unknown 46400 1727204552.03107: variable 'item' from source: include params 46400 1727204552.03199: variable 'item' from source: include params 46400 1727204552.03226: we have included files to process 46400 1727204552.03227: generating all_blocks data 46400 1727204552.03229: done generating all_blocks data 46400 1727204552.03230: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204552.03230: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204552.03231: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204552.03371: done processing included file 46400 1727204552.03373: iterating over new_blocks loaded from include file 46400 1727204552.03374: in VariableManager get_vars() 46400 1727204552.03387: done with get_vars() 46400 1727204552.03388: filtering new block on tags 46400 1727204552.03407: done filtering new block on tags 46400 1727204552.03409: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204552.03413: extending task lists for all hosts with included blocks 46400 1727204552.03512: done extending task lists 46400 1727204552.03513: done processing included files 46400 1727204552.03514: results queue empty 46400 1727204552.03514: checking for any_errors_fatal 46400 1727204552.03518: done checking for any_errors_fatal 46400 1727204552.03518: checking for max_fail_percentage 46400 1727204552.03520: done checking for max_fail_percentage 46400 1727204552.03520: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.03521: done checking to see if all hosts have failed 46400 1727204552.03522: getting the remaining hosts for this loop 46400 1727204552.03523: done getting the remaining hosts for this loop 46400 1727204552.03524: getting the next task for host managed-node2 46400 1727204552.03528: done getting next task for host managed-node2 46400 1727204552.03529: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204552.03532: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204552.03533: getting variables 46400 1727204552.03534: in VariableManager get_vars() 46400 1727204552.03541: Calling all_inventory to load vars for managed-node2 46400 1727204552.03543: Calling groups_inventory to load vars for managed-node2 46400 1727204552.03544: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.03549: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.03550: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.03552: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.04581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.06274: done with get_vars() 46400 1727204552.06301: done getting variables 46400 1727204552.06467: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.082) 0:00:42.349 ***** 46400 1727204552.06516: entering _queue_task() for managed-node2/stat 46400 1727204552.06958: worker is 1 (out of 1 available) 46400 1727204552.06973: exiting _queue_task() for managed-node2/stat 46400 1727204552.06988: done queuing things up, now waiting for results queue to drain 46400 1727204552.06990: waiting for pending results... 46400 1727204552.07182: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204552.07279: in run() - task 0affcd87-79f5-1303-fda8-000000000ef5 46400 1727204552.07289: variable 'ansible_search_path' from source: unknown 46400 1727204552.07292: variable 'ansible_search_path' from source: unknown 46400 1727204552.07323: calling self._execute() 46400 1727204552.07400: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.07403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.07412: variable 'omit' from source: magic vars 46400 1727204552.07700: variable 'ansible_distribution_major_version' from source: facts 46400 1727204552.07710: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204552.07716: variable 'omit' from source: magic vars 46400 1727204552.07759: variable 'omit' from source: magic vars 46400 1727204552.07829: variable 'interface' from source: play vars 46400 1727204552.07842: variable 'omit' from source: magic vars 46400 1727204552.07882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204552.07908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204552.07926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204552.07939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.07949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.07977: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204552.07980: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.07983: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 46400 1727204552.08051: Set connection var ansible_shell_type to sh 46400 1727204552.08059: Set connection var ansible_shell_executable to /bin/sh 46400 1727204552.08066: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204552.08070: Set connection var ansible_connection to ssh 46400 1727204552.08077: Set connection var ansible_pipelining to False 46400 1727204552.08082: Set connection var ansible_timeout to 10 46400 1727204552.08101: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.08106: variable 'ansible_connection' from source: unknown 46400 1727204552.08109: variable 'ansible_module_compression' from source: unknown 46400 1727204552.08111: variable 'ansible_shell_type' from source: unknown 46400 1727204552.08114: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.08116: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.08118: variable 'ansible_pipelining' from source: unknown 46400 1727204552.08121: variable 'ansible_timeout' from source: unknown 46400 1727204552.08123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.08273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204552.08282: variable 'omit' from source: magic vars 46400 1727204552.08287: starting attempt loop 46400 1727204552.08291: running the handler 46400 1727204552.08303: _low_level_execute_command(): starting 46400 1727204552.08311: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204552.08969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.09009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.09047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.09084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.10714: stdout chunk (state=3): >>>/root <<< 46400 1727204552.10819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.10877: stderr chunk (state=3): >>><<< 46400 1727204552.10881: stdout chunk (state=3): >>><<< 46400 1727204552.10901: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.10918: _low_level_execute_command(): starting 46400 1727204552.10924: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439 `" && echo ansible-tmp-1727204552.1090372-49449-44935895495439="` echo /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439 `" ) && sleep 0' 46400 1727204552.11378: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.11387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.11396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.11407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.11434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.11441: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.11450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.11473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.11476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.11479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.11481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.11491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.11501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.11509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.11514: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.11519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.11587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.11594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.11597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 
1727204552.11654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.13499: stdout chunk (state=3): >>>ansible-tmp-1727204552.1090372-49449-44935895495439=/root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439 <<< 46400 1727204552.13610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.13675: stderr chunk (state=3): >>><<< 46400 1727204552.13679: stdout chunk (state=3): >>><<< 46400 1727204552.13697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204552.1090372-49449-44935895495439=/root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.13741: variable 'ansible_module_compression' from source: unknown 46400 1727204552.13797: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204552.13828: variable 'ansible_facts' from source: unknown 46400 1727204552.13894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/AnsiballZ_stat.py 46400 1727204552.14005: Sending initial data 46400 1727204552.14008: Sent initial data (152 bytes) 46400 1727204552.14716: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.14723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.14758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.14768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204552.14776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.14786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.14792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.14850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.14857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.14874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.14921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.16624: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204552.16660: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204552.16694: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpy7m211lr /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/AnsiballZ_stat.py <<< 46400 1727204552.16729: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204552.17506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.17620: stderr chunk (state=3): >>><<< 46400 1727204552.17624: stdout chunk (state=3): >>><<< 46400 1727204552.17642: done transferring module to remote 46400 1727204552.17656: _low_level_execute_command(): starting 46400 1727204552.17660: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/ /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/AnsiballZ_stat.py && sleep 0' 46400 1727204552.18139: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.18142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.18185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.18188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204552.18191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.18198: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.18240: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.18248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.18302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.20002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.20068: stderr chunk (state=3): >>><<< 46400 1727204552.20071: stdout chunk (state=3): >>><<< 46400 1727204552.20085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.20088: _low_level_execute_command(): starting 46400 1727204552.20093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/AnsiballZ_stat.py && sleep 0' 46400 1727204552.20565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.20570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.20605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.20609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204552.20612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.20669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.20676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.20677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.20719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 
1727204552.33981: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32681, "dev": 21, "nlink": 1, "atime": 1727204544.3262405, "mtime": 1727204544.3262405, "ctime": 1727204544.3262405, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204552.34984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204552.35046: stderr chunk (state=3): >>><<< 46400 1727204552.35050: stdout chunk (state=3): >>><<< 46400 1727204552.35069: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32681, "dev": 21, "nlink": 1, "atime": 1727204544.3262405, "mtime": 1727204544.3262405, "ctime": 1727204544.3262405, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
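Based on the module_args echoed back in the stat result above (path /sys/class/net/statebr with checksum, mime, and attribute collection disabled), get_interface_stat.yml plausibly contains a task along these lines; anything beyond what the log shows (the interface variable and the interface_stat result name) is an assumption:

    # get_interface_stat.yml (hypothetical reconstruction from the logged module_args)
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat   # the log later reads interface_stat from set_fact; register is the assumed mechanism here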
46400 1727204552.35116: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204552.35124: _low_level_execute_command(): starting 46400 1727204552.35131: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204552.1090372-49449-44935895495439/ > /dev/null 2>&1 && sleep 0' 46400 1727204552.35614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.35620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.35663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204552.35669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.35678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.35683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.35691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.35700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204552.35709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.35758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.35778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.35823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.37622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.37685: stderr chunk (state=3): >>><<< 46400 1727204552.37689: stdout chunk (state=3): >>><<< 46400 1727204552.37704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.37710: handler run complete 46400 1727204552.37742: attempt loop complete, returning result 46400 1727204552.37745: _execute() done 46400 1727204552.37748: dumping result to json 46400 1727204552.37753: done dumping result, returning 46400 1727204552.37762: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000000ef5] 46400 1727204552.37767: sending task result for task 0affcd87-79f5-1303-fda8-000000000ef5 46400 1727204552.37884: done sending task result for task 0affcd87-79f5-1303-fda8-000000000ef5 46400 1727204552.37887: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204544.3262405, "block_size": 4096, "blocks": 0, "ctime": 1727204544.3262405, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32681, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204544.3262405, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 46400 1727204552.37999: no more pending results, returning what we have 46400 1727204552.38003: results queue empty 46400 1727204552.38004: checking for any_errors_fatal 46400 1727204552.38005: done checking for any_errors_fatal 46400 1727204552.38006: checking for max_fail_percentage 46400 1727204552.38008: done checking for max_fail_percentage 46400 1727204552.38008: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.38009: done checking to see if all hosts have failed 46400 1727204552.38010: getting the remaining hosts for this loop 46400 1727204552.38012: done getting the remaining hosts for this loop 46400 1727204552.38015: getting the next task for host managed-node2 46400 1727204552.38024: done getting next task for host managed-node2 46400 1727204552.38026: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 46400 1727204552.38030: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204552.38033: getting variables 46400 1727204552.38035: in VariableManager get_vars() 46400 1727204552.38070: Calling all_inventory to load vars for managed-node2 46400 1727204552.38073: Calling groups_inventory to load vars for managed-node2 46400 1727204552.38076: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.38086: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.38089: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.38091: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.39033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.39982: done with get_vars() 46400 1727204552.40000: done getting variables 46400 1727204552.40046: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204552.40140: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.336) 0:00:42.686 ***** 46400 1727204552.40171: entering _queue_task() for managed-node2/assert 46400 1727204552.40416: worker is 1 (out of 1 available) 46400 1727204552.40430: exiting _queue_task() for managed-node2/assert 46400 1727204552.40445: done queuing things up, now waiting for results queue to drain 46400 1727204552.40447: waiting for pending results... 
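The assert at assert_device_present.yml:5 then only needs to check the stat result gathered above; a minimal sketch, assuming the condition is exactly the one evaluated for this task below (interface_stat.stat.exists):

    # assert_device_present.yml, line 5 (hypothetical reconstruction)
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists   # evaluated True for statebr in this run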
46400 1727204552.40638: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 46400 1727204552.40726: in run() - task 0affcd87-79f5-1303-fda8-000000000e87 46400 1727204552.40735: variable 'ansible_search_path' from source: unknown 46400 1727204552.40738: variable 'ansible_search_path' from source: unknown 46400 1727204552.40774: calling self._execute() 46400 1727204552.40850: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.40854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.40867: variable 'omit' from source: magic vars 46400 1727204552.41148: variable 'ansible_distribution_major_version' from source: facts 46400 1727204552.41158: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204552.41167: variable 'omit' from source: magic vars 46400 1727204552.41200: variable 'omit' from source: magic vars 46400 1727204552.41275: variable 'interface' from source: play vars 46400 1727204552.41289: variable 'omit' from source: magic vars 46400 1727204552.41329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204552.41363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204552.41385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204552.41399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.41409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.41434: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204552.41437: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.41441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.41512: Set connection var ansible_shell_type to sh 46400 1727204552.41521: Set connection var ansible_shell_executable to /bin/sh 46400 1727204552.41524: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204552.41530: Set connection var ansible_connection to ssh 46400 1727204552.41534: Set connection var ansible_pipelining to False 46400 1727204552.41540: Set connection var ansible_timeout to 10 46400 1727204552.41559: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.41562: variable 'ansible_connection' from source: unknown 46400 1727204552.41568: variable 'ansible_module_compression' from source: unknown 46400 1727204552.41571: variable 'ansible_shell_type' from source: unknown 46400 1727204552.41574: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.41576: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.41580: variable 'ansible_pipelining' from source: unknown 46400 1727204552.41582: variable 'ansible_timeout' from source: unknown 46400 1727204552.41586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.41702: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204552.41711: variable 'omit' from source: magic vars 46400 1727204552.41716: starting attempt loop 46400 1727204552.41719: running the handler 46400 1727204552.41820: variable 'interface_stat' from source: set_fact 46400 1727204552.41837: Evaluated conditional (interface_stat.stat.exists): True 46400 1727204552.41847: handler run complete 46400 1727204552.41855: attempt loop complete, returning result 46400 1727204552.41858: _execute() done 46400 1727204552.41862: dumping result to json 46400 1727204552.41870: done dumping result, returning 46400 1727204552.41873: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [0affcd87-79f5-1303-fda8-000000000e87] 46400 1727204552.41879: sending task result for task 0affcd87-79f5-1303-fda8-000000000e87 46400 1727204552.41967: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e87 46400 1727204552.41970: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204552.42020: no more pending results, returning what we have 46400 1727204552.42024: results queue empty 46400 1727204552.42025: checking for any_errors_fatal 46400 1727204552.42038: done checking for any_errors_fatal 46400 1727204552.42039: checking for max_fail_percentage 46400 1727204552.42041: done checking for max_fail_percentage 46400 1727204552.42042: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.42043: done checking to see if all hosts have failed 46400 1727204552.42044: getting the remaining hosts for this loop 46400 1727204552.42045: done getting the remaining hosts for this loop 46400 1727204552.42049: getting the next task for host managed-node2 46400 1727204552.42067: done getting next task for host managed-node2 46400 1727204552.42072: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204552.42075: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204552.42079: getting variables 46400 1727204552.42081: in VariableManager get_vars() 46400 1727204552.42115: Calling all_inventory to load vars for managed-node2 46400 1727204552.42117: Calling groups_inventory to load vars for managed-node2 46400 1727204552.42120: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.42131: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.42134: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.42136: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.42979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.44179: done with get_vars() 46400 1727204552.44210: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.041) 0:00:42.727 ***** 46400 1727204552.44313: entering _queue_task() for managed-node2/include_tasks 46400 1727204552.44660: worker is 1 (out of 1 available) 46400 1727204552.44677: exiting _queue_task() for managed-node2/include_tasks 46400 1727204552.44692: done queuing things up, now waiting for results queue to drain 46400 1727204552.44693: waiting for pending results... 46400 1727204552.45015: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204552.45104: in run() - task 0affcd87-79f5-1303-fda8-000000000e8b 46400 1727204552.45114: variable 'ansible_search_path' from source: unknown 46400 1727204552.45117: variable 'ansible_search_path' from source: unknown 46400 1727204552.45154: calling self._execute() 46400 1727204552.45233: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.45238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.45248: variable 'omit' from source: magic vars 46400 1727204552.45527: variable 'ansible_distribution_major_version' from source: facts 46400 1727204552.45537: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204552.45549: _execute() done 46400 1727204552.45552: dumping result to json 46400 1727204552.45556: done dumping result, returning 46400 1727204552.45559: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-000000000e8b] 46400 1727204552.45566: sending task result for task 0affcd87-79f5-1303-fda8-000000000e8b 46400 1727204552.45655: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e8b 46400 1727204552.45658: WORKER PROCESS EXITING 46400 1727204552.45701: no more pending results, returning what we have 46400 1727204552.45706: in VariableManager get_vars() 46400 1727204552.45749: Calling all_inventory to load vars for managed-node2 46400 1727204552.45753: Calling groups_inventory to load vars for managed-node2 46400 1727204552.45757: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.45779: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.45782: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.45786: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.46759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204552.48381: done with get_vars() 46400 1727204552.48410: variable 'ansible_search_path' from source: unknown 46400 1727204552.48412: variable 'ansible_search_path' from source: unknown 46400 1727204552.48422: variable 'item' from source: include params 46400 1727204552.48538: variable 'item' from source: include params 46400 1727204552.48579: we have included files to process 46400 1727204552.48581: generating all_blocks data 46400 1727204552.48583: done generating all_blocks data 46400 1727204552.48587: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204552.48588: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204552.48590: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204552.49527: done processing included file 46400 1727204552.49529: iterating over new_blocks loaded from include file 46400 1727204552.49531: in VariableManager get_vars() 46400 1727204552.49548: done with get_vars() 46400 1727204552.49550: filtering new block on tags 46400 1727204552.49624: done filtering new block on tags 46400 1727204552.49627: in VariableManager get_vars() 46400 1727204552.49644: done with get_vars() 46400 1727204552.49646: filtering new block on tags 46400 1727204552.49707: done filtering new block on tags 46400 1727204552.49709: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204552.49715: extending task lists for all hosts with included blocks 46400 1727204552.50077: done extending task lists 46400 1727204552.50078: done processing included files 46400 1727204552.50079: results queue empty 46400 1727204552.50080: checking for any_errors_fatal 46400 1727204552.50085: done checking for any_errors_fatal 46400 1727204552.50086: checking for max_fail_percentage 46400 1727204552.50087: done checking for max_fail_percentage 46400 1727204552.50087: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.50088: done checking to see if all hosts have failed 46400 1727204552.50089: getting the remaining hosts for this loop 46400 1727204552.50090: done getting the remaining hosts for this loop 46400 1727204552.50093: getting the next task for host managed-node2 46400 1727204552.50098: done getting next task for host managed-node2 46400 1727204552.50100: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204552.50103: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204552.50105: getting variables 46400 1727204552.50106: in VariableManager get_vars() 46400 1727204552.50116: Calling all_inventory to load vars for managed-node2 46400 1727204552.50118: Calling groups_inventory to load vars for managed-node2 46400 1727204552.50121: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.50127: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.50129: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.50132: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.51365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.53100: done with get_vars() 46400 1727204552.53123: done getting variables 46400 1727204552.53177: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.088) 0:00:42.816 ***** 46400 1727204552.53213: entering _queue_task() for managed-node2/set_fact 46400 1727204552.53584: worker is 1 (out of 1 available) 46400 1727204552.53597: exiting _queue_task() for managed-node2/set_fact 46400 1727204552.53610: done queuing things up, now waiting for results queue to drain 46400 1727204552.53612: waiting for pending results... 
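The set_fact result printed a little further down initializes three flags; a plausible sketch of the task at get_profile_stat.yml:3, with the fact names and values taken from that ok: result and everything else assumed:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    # names and initial values copied from the ansible_facts shown below; structure assumed
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false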
46400 1727204552.53924: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204552.54072: in run() - task 0affcd87-79f5-1303-fda8-000000000f13 46400 1727204552.54093: variable 'ansible_search_path' from source: unknown 46400 1727204552.54102: variable 'ansible_search_path' from source: unknown 46400 1727204552.54142: calling self._execute() 46400 1727204552.54241: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.54253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.54273: variable 'omit' from source: magic vars 46400 1727204552.54639: variable 'ansible_distribution_major_version' from source: facts 46400 1727204552.54654: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204552.54670: variable 'omit' from source: magic vars 46400 1727204552.54736: variable 'omit' from source: magic vars 46400 1727204552.54777: variable 'omit' from source: magic vars 46400 1727204552.54827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204552.54870: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204552.54896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204552.54916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.54932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.54967: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204552.54977: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.54985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.55096: Set connection var ansible_shell_type to sh 46400 1727204552.55111: Set connection var ansible_shell_executable to /bin/sh 46400 1727204552.55121: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204552.55131: Set connection var ansible_connection to ssh 46400 1727204552.55145: Set connection var ansible_pipelining to False 46400 1727204552.55154: Set connection var ansible_timeout to 10 46400 1727204552.55188: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.55197: variable 'ansible_connection' from source: unknown 46400 1727204552.55204: variable 'ansible_module_compression' from source: unknown 46400 1727204552.55211: variable 'ansible_shell_type' from source: unknown 46400 1727204552.55218: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.55226: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.55234: variable 'ansible_pipelining' from source: unknown 46400 1727204552.55243: variable 'ansible_timeout' from source: unknown 46400 1727204552.55254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.55415: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204552.55432: variable 
'omit' from source: magic vars 46400 1727204552.55443: starting attempt loop 46400 1727204552.55450: running the handler 46400 1727204552.55476: handler run complete 46400 1727204552.55492: attempt loop complete, returning result 46400 1727204552.55499: _execute() done 46400 1727204552.55505: dumping result to json 46400 1727204552.55513: done dumping result, returning 46400 1727204552.55523: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-000000000f13] 46400 1727204552.55533: sending task result for task 0affcd87-79f5-1303-fda8-000000000f13 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204552.55699: no more pending results, returning what we have 46400 1727204552.55705: results queue empty 46400 1727204552.55706: checking for any_errors_fatal 46400 1727204552.55707: done checking for any_errors_fatal 46400 1727204552.55708: checking for max_fail_percentage 46400 1727204552.55710: done checking for max_fail_percentage 46400 1727204552.55711: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.55712: done checking to see if all hosts have failed 46400 1727204552.55713: getting the remaining hosts for this loop 46400 1727204552.55715: done getting the remaining hosts for this loop 46400 1727204552.55719: getting the next task for host managed-node2 46400 1727204552.55730: done getting next task for host managed-node2 46400 1727204552.55733: ^ task is: TASK: Stat profile file 46400 1727204552.55739: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204552.55743: getting variables 46400 1727204552.55745: in VariableManager get_vars() 46400 1727204552.55790: Calling all_inventory to load vars for managed-node2 46400 1727204552.55793: Calling groups_inventory to load vars for managed-node2 46400 1727204552.55797: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.55810: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.55813: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.55816: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.56986: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f13 46400 1727204552.56990: WORKER PROCESS EXITING 46400 1727204552.57579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.59315: done with get_vars() 46400 1727204552.59344: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.062) 0:00:42.879 ***** 46400 1727204552.59448: entering _queue_task() for managed-node2/stat 46400 1727204552.59797: worker is 1 (out of 1 available) 46400 1727204552.59810: exiting _queue_task() for managed-node2/stat 46400 1727204552.59823: done queuing things up, now waiting for results queue to drain 46400 1727204552.59824: waiting for pending results... 46400 1727204552.60125: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204552.60239: in run() - task 0affcd87-79f5-1303-fda8-000000000f14 46400 1727204552.60256: variable 'ansible_search_path' from source: unknown 46400 1727204552.60268: variable 'ansible_search_path' from source: unknown 46400 1727204552.60306: calling self._execute() 46400 1727204552.60409: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.60422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.60437: variable 'omit' from source: magic vars 46400 1727204552.60867: variable 'ansible_distribution_major_version' from source: facts 46400 1727204552.60886: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204552.60898: variable 'omit' from source: magic vars 46400 1727204552.60968: variable 'omit' from source: magic vars 46400 1727204552.61080: variable 'profile' from source: play vars 46400 1727204552.61092: variable 'interface' from source: play vars 46400 1727204552.61166: variable 'interface' from source: play vars 46400 1727204552.61192: variable 'omit' from source: magic vars 46400 1727204552.61246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204552.61296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204552.61325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204552.61348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.61374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204552.61409: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 46400 1727204552.61419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.61427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.61538: Set connection var ansible_shell_type to sh 46400 1727204552.61554: Set connection var ansible_shell_executable to /bin/sh 46400 1727204552.61569: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204552.61584: Set connection var ansible_connection to ssh 46400 1727204552.61594: Set connection var ansible_pipelining to False 46400 1727204552.61605: Set connection var ansible_timeout to 10 46400 1727204552.61633: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.61643: variable 'ansible_connection' from source: unknown 46400 1727204552.61651: variable 'ansible_module_compression' from source: unknown 46400 1727204552.61658: variable 'ansible_shell_type' from source: unknown 46400 1727204552.61670: variable 'ansible_shell_executable' from source: unknown 46400 1727204552.61677: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204552.61689: variable 'ansible_pipelining' from source: unknown 46400 1727204552.61697: variable 'ansible_timeout' from source: unknown 46400 1727204552.61704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204552.61932: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204552.61949: variable 'omit' from source: magic vars 46400 1727204552.61959: starting attempt loop 46400 1727204552.61973: running the handler 46400 1727204552.61992: _low_level_execute_command(): starting 46400 1727204552.62004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204552.62804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.62820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.62837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.62858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.62912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.62927: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.62943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.62967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.62982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.62998: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.63011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.63027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.63044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.63058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
<<< 46400 1727204552.63077: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.63093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.63174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.63190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.63206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.63284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.64953: stdout chunk (state=3): >>>/root <<< 46400 1727204552.65049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.65134: stderr chunk (state=3): >>><<< 46400 1727204552.65137: stdout chunk (state=3): >>><<< 46400 1727204552.65247: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.65251: _low_level_execute_command(): starting 46400 1727204552.65255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047 `" && echo ansible-tmp-1727204552.6515791-49463-221961187064047="` echo /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047 `" ) && sleep 0' 46400 1727204552.66052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.66056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.66096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.66100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.66103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.66173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.66895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.66914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.68776: stdout chunk (state=3): >>>ansible-tmp-1727204552.6515791-49463-221961187064047=/root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047 <<< 46400 1727204552.68880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.68955: stderr chunk (state=3): >>><<< 46400 1727204552.68958: stdout chunk (state=3): >>><<< 46400 1727204552.69274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204552.6515791-49463-221961187064047=/root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.69278: variable 'ansible_module_compression' from source: unknown 46400 1727204552.69280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204552.69282: variable 'ansible_facts' from source: unknown 46400 1727204552.69284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/AnsiballZ_stat.py 46400 1727204552.69414: Sending initial data 46400 1727204552.69417: Sent initial data (153 bytes) 46400 1727204552.70506: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.70534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.70551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.70577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.70627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.70641: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.70656: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.70681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.70700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.70714: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.70727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.70742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.70759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.70779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.70791: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.70810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.70894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.70931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.70955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.71039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.72795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204552.72849: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204552.72892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpvz53_xj9 /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/AnsiballZ_stat.py <<< 46400 1727204552.73242: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204552.74191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.74371: stderr chunk (state=3): >>><<< 46400 1727204552.74374: stdout chunk (state=3): >>><<< 46400 1727204552.74376: done transferring module to remote 46400 1727204552.74378: _low_level_execute_command(): starting 46400 1727204552.74380: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/ /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/AnsiballZ_stat.py && sleep 0' 46400 1727204552.75015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.75025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.75036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.75050: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.75093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.75100: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.75110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.75123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.75131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.75137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.75145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.75154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.75169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.75175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.75182: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.75193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.75274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.75281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.75284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.75352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.77105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.77130: stderr chunk (state=3): >>><<< 46400 1727204552.77133: stdout chunk (state=3): >>><<< 46400 1727204552.77230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.77235: _low_level_execute_command(): starting 46400 1727204552.77237: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/AnsiballZ_stat.py && sleep 0' 46400 1727204552.77931: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.77949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.77976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.77997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.78046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.78062: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.78082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.78101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.78113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.78125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.78137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.78152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.78175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.78188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.78199: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.78213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.78293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.78310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.78326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.78417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.91488: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204552.92514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204552.92541: stderr chunk (state=3): >>><<< 46400 1727204552.92544: stdout chunk (state=3): >>><<< 46400 1727204552.92682: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204552.92688: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204552.92695: _low_level_execute_command(): starting 46400 1727204552.92698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204552.6515791-49463-221961187064047/ > /dev/null 2>&1 && sleep 0' 46400 1727204552.93247: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204552.93256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.93273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.93283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.93323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.93327: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204552.93338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.93351: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204552.93362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204552.93366: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204552.93380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204552.93389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204552.93401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204552.93410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204552.93416: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204552.93424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204552.93500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204552.93520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204552.93531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204552.93602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204552.95406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204552.95513: stderr chunk (state=3): >>><<< 46400 1727204552.95532: stdout chunk (state=3): >>><<< 46400 1727204552.95776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204552.95779: handler run complete 46400 1727204552.95782: attempt loop complete, returning result 46400 1727204552.95784: _execute() done 46400 1727204552.95786: dumping result to json 46400 1727204552.95788: done dumping result, returning 46400 1727204552.95790: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-000000000f14] 46400 1727204552.95792: sending task result for task 0affcd87-79f5-1303-fda8-000000000f14 46400 1727204552.95878: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f14 46400 1727204552.95882: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204552.95945: no more pending results, returning what we have 46400 
1727204552.95950: results queue empty 46400 1727204552.95951: checking for any_errors_fatal 46400 1727204552.95965: done checking for any_errors_fatal 46400 1727204552.95966: checking for max_fail_percentage 46400 1727204552.95969: done checking for max_fail_percentage 46400 1727204552.95970: checking to see if all hosts have failed and the running result is not ok 46400 1727204552.95971: done checking to see if all hosts have failed 46400 1727204552.95971: getting the remaining hosts for this loop 46400 1727204552.95973: done getting the remaining hosts for this loop 46400 1727204552.95978: getting the next task for host managed-node2 46400 1727204552.95987: done getting next task for host managed-node2 46400 1727204552.95991: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204552.95995: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204552.96000: getting variables 46400 1727204552.96002: in VariableManager get_vars() 46400 1727204552.96040: Calling all_inventory to load vars for managed-node2 46400 1727204552.96043: Calling groups_inventory to load vars for managed-node2 46400 1727204552.96046: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204552.96058: Calling all_plugins_play to load vars for managed-node2 46400 1727204552.96065: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204552.96069: Calling groups_plugins_play to load vars for managed-node2 46400 1727204552.98049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204552.99879: done with get_vars() 46400 1727204552.99907: done getting variables 46400 1727204552.99982: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:02:32 -0400 (0:00:00.405) 0:00:43.284 ***** 46400 1727204553.00020: entering _queue_task() for managed-node2/set_fact 46400 1727204553.00405: worker is 1 (out of 1 available) 46400 1727204553.00424: exiting _queue_task() for managed-node2/set_fact 46400 1727204553.00440: done queuing things up, now waiting for results queue to drain 46400 1727204553.00441: waiting for pending results... 
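The module_args echoed in the _execute_module line above fully describe the stat invocation for this step. A hedged rendering of the "Stat profile file" task at get_profile_stat.yml:9, where the profile variable renders to statebr and the register name is inferred from the profile_stat.stat.exists conditional checked in the next task:

- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # rendered as ifcfg-statebr above
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # inferred; exists == false here drives the skip reported below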
46400 1727204553.00753: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204553.00907: in run() - task 0affcd87-79f5-1303-fda8-000000000f15 46400 1727204553.00927: variable 'ansible_search_path' from source: unknown 46400 1727204553.00934: variable 'ansible_search_path' from source: unknown 46400 1727204553.00982: calling self._execute() 46400 1727204553.01085: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.01105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.01122: variable 'omit' from source: magic vars 46400 1727204553.01540: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.01568: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.01711: variable 'profile_stat' from source: set_fact 46400 1727204553.01725: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204553.01734: when evaluation is False, skipping this task 46400 1727204553.01746: _execute() done 46400 1727204553.01763: dumping result to json 46400 1727204553.01777: done dumping result, returning 46400 1727204553.01788: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-000000000f15] 46400 1727204553.01800: sending task result for task 0affcd87-79f5-1303-fda8-000000000f15 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204553.01977: no more pending results, returning what we have 46400 1727204553.01984: results queue empty 46400 1727204553.01985: checking for any_errors_fatal 46400 1727204553.01997: done checking for any_errors_fatal 46400 1727204553.01998: checking for max_fail_percentage 46400 1727204553.02000: done checking for max_fail_percentage 46400 1727204553.02002: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.02003: done checking to see if all hosts have failed 46400 1727204553.02003: getting the remaining hosts for this loop 46400 1727204553.02005: done getting the remaining hosts for this loop 46400 1727204553.02010: getting the next task for host managed-node2 46400 1727204553.02020: done getting next task for host managed-node2 46400 1727204553.02024: ^ task is: TASK: Get NM profile info 46400 1727204553.02030: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204553.02036: getting variables 46400 1727204553.02038: in VariableManager get_vars() 46400 1727204553.02084: Calling all_inventory to load vars for managed-node2 46400 1727204553.02088: Calling groups_inventory to load vars for managed-node2 46400 1727204553.02092: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.02106: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.02110: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.02113: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.03116: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f15 46400 1727204553.03120: WORKER PROCESS EXITING 46400 1727204553.04035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.05490: done with get_vars() 46400 1727204553.05511: done getting variables 46400 1727204553.05563: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.055) 0:00:43.340 ***** 46400 1727204553.05590: entering _queue_task() for managed-node2/shell 46400 1727204553.05834: worker is 1 (out of 1 available) 46400 1727204553.05847: exiting _queue_task() for managed-node2/shell 46400 1727204553.05860: done queuing things up, now waiting for results queue to drain 46400 1727204553.05861: waiting for pending results... 
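The 'Get NM profile info' task runs through the shell action plugin; the command it ultimately executes on managed-node2 is shown further down in this log (nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc), and judging by the conditional evaluated in the following set_fact task, its result is registered as nm_profile_exists. A hedged sketch of the task shape, assuming the grep target is templated from the profile/interface play vars and that the task tolerates a non-zero exit when no profile is found:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true   # assumption: a missing profile should not abort the play
  changed_when: false   # assumption: would explain why the task later reports ok rather than changed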
46400 1727204553.06043: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204553.06117: in run() - task 0affcd87-79f5-1303-fda8-000000000f16 46400 1727204553.06127: variable 'ansible_search_path' from source: unknown 46400 1727204553.06131: variable 'ansible_search_path' from source: unknown 46400 1727204553.06158: calling self._execute() 46400 1727204553.06235: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.06241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.06249: variable 'omit' from source: magic vars 46400 1727204553.06538: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.06549: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.06555: variable 'omit' from source: magic vars 46400 1727204553.06597: variable 'omit' from source: magic vars 46400 1727204553.06675: variable 'profile' from source: play vars 46400 1727204553.06679: variable 'interface' from source: play vars 46400 1727204553.06724: variable 'interface' from source: play vars 46400 1727204553.06743: variable 'omit' from source: magic vars 46400 1727204553.06787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204553.06844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204553.06912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204553.06929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.06963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.07681: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204553.07685: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.07687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.07690: Set connection var ansible_shell_type to sh 46400 1727204553.07692: Set connection var ansible_shell_executable to /bin/sh 46400 1727204553.07694: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204553.07696: Set connection var ansible_connection to ssh 46400 1727204553.07697: Set connection var ansible_pipelining to False 46400 1727204553.07700: Set connection var ansible_timeout to 10 46400 1727204553.07701: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.07704: variable 'ansible_connection' from source: unknown 46400 1727204553.07706: variable 'ansible_module_compression' from source: unknown 46400 1727204553.07708: variable 'ansible_shell_type' from source: unknown 46400 1727204553.07709: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.07711: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.07713: variable 'ansible_pipelining' from source: unknown 46400 1727204553.07715: variable 'ansible_timeout' from source: unknown 46400 1727204553.07717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.07721: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204553.07723: variable 'omit' from source: magic vars 46400 1727204553.07725: starting attempt loop 46400 1727204553.07726: running the handler 46400 1727204553.07729: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204553.07731: _low_level_execute_command(): starting 46400 1727204553.07733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204553.08414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.08484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204553.08498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.08510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.08580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.10191: stdout chunk (state=3): >>>/root <<< 46400 1727204553.10293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204553.10342: stderr chunk (state=3): >>><<< 46400 1727204553.10346: stdout chunk (state=3): >>><<< 46400 1727204553.10374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204553.10388: _low_level_execute_command(): starting 46400 1727204553.10394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058 `" && echo ansible-tmp-1727204553.1037629-49487-107936518139058="` echo /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058 `" ) && sleep 0' 46400 1727204553.10850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.10854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.10889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204553.10895: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.10936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204553.10952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.10970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.11029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.12875: stdout chunk (state=3): >>>ansible-tmp-1727204553.1037629-49487-107936518139058=/root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058 <<< 46400 1727204553.13052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204553.13056: stderr chunk (state=3): >>><<< 46400 1727204553.13059: stdout chunk (state=3): >>><<< 46400 1727204553.13374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204553.1037629-49487-107936518139058=/root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204553.13377: variable 'ansible_module_compression' from source: unknown 46400 1727204553.13380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204553.13382: variable 'ansible_facts' from source: unknown 46400 1727204553.13384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/AnsiballZ_command.py 46400 1727204553.13777: Sending initial data 46400 1727204553.13780: Sent initial data (156 bytes) 46400 1727204553.14728: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204553.14737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.14747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.14766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.14803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204553.14811: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204553.14820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.14834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204553.14841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204553.14848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204553.14856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.14869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.14885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.14892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204553.14899: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204553.14907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.14983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204553.14998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.15002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.15071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.16826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204553.16869: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204553.16908: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpoh9y5ffx /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/AnsiballZ_command.py <<< 46400 1727204553.16943: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204553.18123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204553.18208: stderr chunk (state=3): >>><<< 46400 1727204553.18211: stdout chunk (state=3): >>><<< 46400 1727204553.18234: done transferring module to remote 46400 1727204553.18245: _low_level_execute_command(): starting 46400 1727204553.18248: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/ /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/AnsiballZ_command.py && sleep 0' 46400 1727204553.18924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204553.18928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.18945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.18950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.19043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204553.19047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204553.19068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.19072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.19090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.19181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204553.19188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.19203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.19270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.21068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204553.21123: stderr chunk (state=3): >>><<< 46400 1727204553.21126: stdout chunk (state=3): >>><<< 46400 1727204553.21149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204553.21153: _low_level_execute_command(): starting 46400 1727204553.21155: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/AnsiballZ_command.py && sleep 0' 46400 1727204553.21873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.21879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.21895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.21931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.21937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204553.21949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.21955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204553.21963: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204553.21985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.22066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.22086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.22158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.37319: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:33.353932", "end": "2024-09-24 15:02:33.372154", "delta": "0:00:00.018222", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 46400 1727204553.38583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204553.38587: stdout chunk (state=3): >>><<< 46400 1727204553.38593: stderr chunk (state=3): >>><<< 46400 1727204553.38615: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:33.353932", "end": "2024-09-24 15:02:33.372154", "delta": "0:00:00.018222", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204553.38655: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204553.38669: _low_level_execute_command(): starting 46400 1727204553.38672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204553.1037629-49487-107936518139058/ > /dev/null 2>&1 && sleep 0' 46400 1727204553.39313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204553.39321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.39332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.39345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.39388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204553.39396: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204553.39404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.39418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204553.39425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204553.39432: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204553.39440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204553.39449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204553.39463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204553.39471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204553.39478: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204553.39487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204553.39559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204553.39576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204553.39584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204553.39685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204553.41414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204553.41508: stderr chunk (state=3): >>><<< 46400 1727204553.41514: stdout chunk (state=3): >>><<< 46400 1727204553.41537: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204553.41544: handler run complete 46400 1727204553.41576: Evaluated conditional (False): False 46400 1727204553.41588: attempt loop complete, returning result 46400 1727204553.41591: _execute() done 46400 1727204553.41594: dumping result to json 46400 1727204553.41599: done dumping result, returning 46400 1727204553.41608: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-000000000f16] 46400 1727204553.41614: sending task result for task 0affcd87-79f5-1303-fda8-000000000f16 46400 1727204553.41724: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f16 46400 1727204553.41727: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018222", "end": "2024-09-24 15:02:33.372154", "rc": 0, "start": "2024-09-24 15:02:33.353932" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 46400 1727204553.41805: no more pending results, returning what we have 46400 1727204553.41810: results queue empty 46400 1727204553.41812: checking for any_errors_fatal 46400 1727204553.41818: done checking for any_errors_fatal 46400 1727204553.41819: checking for max_fail_percentage 46400 1727204553.41821: done checking for max_fail_percentage 46400 1727204553.41822: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.41823: done checking to see if all hosts have failed 46400 1727204553.41824: getting the remaining hosts for this loop 46400 1727204553.41826: done getting the remaining hosts for this loop 46400 1727204553.41830: getting the next task for host managed-node2 46400 1727204553.41838: done getting next task for host managed-node2 46400 1727204553.41841: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204553.41846: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204553.41850: getting variables 46400 1727204553.41851: in VariableManager get_vars() 46400 1727204553.41886: Calling all_inventory to load vars for managed-node2 46400 1727204553.41889: Calling groups_inventory to load vars for managed-node2 46400 1727204553.41892: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.41903: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.41906: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.41908: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.43799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.45502: done with get_vars() 46400 1727204553.45526: done getting variables 46400 1727204553.45589: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.400) 0:00:43.740 ***** 46400 1727204553.45628: entering _queue_task() for managed-node2/set_fact 46400 1727204553.45993: worker is 1 (out of 1 available) 46400 1727204553.46005: exiting _queue_task() for managed-node2/set_fact 46400 1727204553.46018: done queuing things up, now waiting for results queue to drain 46400 1727204553.46020: waiting for pending results... 
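This set_fact task keys on the registered nmcli result: nm_profile_exists.rc is 0 because the statebr profile was found under /etc/NetworkManager/system-connections, so the task sets the three lsr_net_profile_* flags that later assert tasks consume. A sketch consistent with the conditional and with the ansible_facts shown in the result below; the exact layout of the real task is an assumption:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0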
46400 1727204553.46323: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204553.46458: in run() - task 0affcd87-79f5-1303-fda8-000000000f17 46400 1727204553.46475: variable 'ansible_search_path' from source: unknown 46400 1727204553.46480: variable 'ansible_search_path' from source: unknown 46400 1727204553.46514: calling self._execute() 46400 1727204553.46606: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.46612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.46622: variable 'omit' from source: magic vars 46400 1727204553.47007: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.47024: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.47157: variable 'nm_profile_exists' from source: set_fact 46400 1727204553.47170: Evaluated conditional (nm_profile_exists.rc == 0): True 46400 1727204553.47177: variable 'omit' from source: magic vars 46400 1727204553.47236: variable 'omit' from source: magic vars 46400 1727204553.47273: variable 'omit' from source: magic vars 46400 1727204553.47321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204553.47467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204553.47471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204553.47473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.47476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.47478: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204553.47480: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.47483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.47562: Set connection var ansible_shell_type to sh 46400 1727204553.47573: Set connection var ansible_shell_executable to /bin/sh 46400 1727204553.47579: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204553.47584: Set connection var ansible_connection to ssh 46400 1727204553.47590: Set connection var ansible_pipelining to False 46400 1727204553.47597: Set connection var ansible_timeout to 10 46400 1727204553.47622: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.47627: variable 'ansible_connection' from source: unknown 46400 1727204553.47630: variable 'ansible_module_compression' from source: unknown 46400 1727204553.47632: variable 'ansible_shell_type' from source: unknown 46400 1727204553.47635: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.47637: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.47642: variable 'ansible_pipelining' from source: unknown 46400 1727204553.47644: variable 'ansible_timeout' from source: unknown 46400 1727204553.47649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.47798: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204553.47810: variable 'omit' from source: magic vars 46400 1727204553.47813: starting attempt loop 46400 1727204553.47816: running the handler 46400 1727204553.47829: handler run complete 46400 1727204553.47844: attempt loop complete, returning result 46400 1727204553.47847: _execute() done 46400 1727204553.47849: dumping result to json 46400 1727204553.47852: done dumping result, returning 46400 1727204553.47858: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-000000000f17] 46400 1727204553.47865: sending task result for task 0affcd87-79f5-1303-fda8-000000000f17 46400 1727204553.47955: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f17 46400 1727204553.47957: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 46400 1727204553.48040: no more pending results, returning what we have 46400 1727204553.48045: results queue empty 46400 1727204553.48046: checking for any_errors_fatal 46400 1727204553.48057: done checking for any_errors_fatal 46400 1727204553.48058: checking for max_fail_percentage 46400 1727204553.48061: done checking for max_fail_percentage 46400 1727204553.48062: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.48063: done checking to see if all hosts have failed 46400 1727204553.48065: getting the remaining hosts for this loop 46400 1727204553.48067: done getting the remaining hosts for this loop 46400 1727204553.48072: getting the next task for host managed-node2 46400 1727204553.48085: done getting next task for host managed-node2 46400 1727204553.48088: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204553.48093: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204553.48097: getting variables 46400 1727204553.48099: in VariableManager get_vars() 46400 1727204553.48136: Calling all_inventory to load vars for managed-node2 46400 1727204553.48139: Calling groups_inventory to load vars for managed-node2 46400 1727204553.48143: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.48155: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.48158: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.48161: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.49849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.51733: done with get_vars() 46400 1727204553.51761: done getting variables 46400 1727204553.51823: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.51955: variable 'profile' from source: play vars 46400 1727204553.51959: variable 'interface' from source: play vars 46400 1727204553.52024: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.064) 0:00:43.805 ***** 46400 1727204553.52059: entering _queue_task() for managed-node2/command 46400 1727204553.52411: worker is 1 (out of 1 available) 46400 1727204553.52424: exiting _queue_task() for managed-node2/command 46400 1727204553.52436: done queuing things up, now waiting for results queue to drain 46400 1727204553.52438: waiting for pending results... 
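The next four tasks in get_profile_stat.yml check an initscripts-style ifcfg file for the profile; all of them are gated on profile_stat.stat.exists, and because the statebr profile exists here only as a NetworkManager keyfile, each of them is skipped in the entries that follow. A rough sketch of what this 'Get the ansible_managed comment' command task could look like; the file path, grep pattern, and register name are assumptions, only the task name and the when condition come from the log:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists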
46400 1727204553.52741: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204553.52869: in run() - task 0affcd87-79f5-1303-fda8-000000000f19 46400 1727204553.52885: variable 'ansible_search_path' from source: unknown 46400 1727204553.52888: variable 'ansible_search_path' from source: unknown 46400 1727204553.52925: calling self._execute() 46400 1727204553.53025: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.53030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.53041: variable 'omit' from source: magic vars 46400 1727204553.53424: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.53438: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.53568: variable 'profile_stat' from source: set_fact 46400 1727204553.53579: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204553.53582: when evaluation is False, skipping this task 46400 1727204553.53585: _execute() done 46400 1727204553.53587: dumping result to json 46400 1727204553.53590: done dumping result, returning 46400 1727204553.53602: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000f19] 46400 1727204553.53609: sending task result for task 0affcd87-79f5-1303-fda8-000000000f19 46400 1727204553.53703: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f19 46400 1727204553.53707: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204553.53758: no more pending results, returning what we have 46400 1727204553.53763: results queue empty 46400 1727204553.53767: checking for any_errors_fatal 46400 1727204553.53775: done checking for any_errors_fatal 46400 1727204553.53776: checking for max_fail_percentage 46400 1727204553.53779: done checking for max_fail_percentage 46400 1727204553.53780: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.53781: done checking to see if all hosts have failed 46400 1727204553.53782: getting the remaining hosts for this loop 46400 1727204553.53783: done getting the remaining hosts for this loop 46400 1727204553.53788: getting the next task for host managed-node2 46400 1727204553.53797: done getting next task for host managed-node2 46400 1727204553.53801: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204553.53806: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204553.53813: getting variables 46400 1727204553.53815: in VariableManager get_vars() 46400 1727204553.53851: Calling all_inventory to load vars for managed-node2 46400 1727204553.53854: Calling groups_inventory to load vars for managed-node2 46400 1727204553.53858: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.53875: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.53878: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.53882: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.55549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.57283: done with get_vars() 46400 1727204553.57313: done getting variables 46400 1727204553.57375: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.57493: variable 'profile' from source: play vars 46400 1727204553.57498: variable 'interface' from source: play vars 46400 1727204553.57565: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.055) 0:00:43.860 ***** 46400 1727204553.57597: entering _queue_task() for managed-node2/set_fact 46400 1727204553.57919: worker is 1 (out of 1 available) 46400 1727204553.57932: exiting _queue_task() for managed-node2/set_fact 46400 1727204553.57946: done queuing things up, now waiting for results queue to drain 46400 1727204553.57947: waiting for pending results... 
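The companion 'Verify the ansible_managed comment' task loads the set_fact action rather than command, so when an ifcfg file does exist it would record the verification outcome as a fact; here it is skipped for the same profile_stat.stat.exists reason. A hypothetical sketch of that shape (the second condition and the register it inspects are assumptions carried over from the sketch above):

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true
  when:
    - profile_stat.stat.exists
    - '"ansible managed" in ifcfg_ansible_managed.stdout'   # hypothetical check on the earlier grep result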
46400 1727204553.58230: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204553.58369: in run() - task 0affcd87-79f5-1303-fda8-000000000f1a 46400 1727204553.58379: variable 'ansible_search_path' from source: unknown 46400 1727204553.58382: variable 'ansible_search_path' from source: unknown 46400 1727204553.58429: calling self._execute() 46400 1727204553.58531: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.58535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.58544: variable 'omit' from source: magic vars 46400 1727204553.58931: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.58951: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.59084: variable 'profile_stat' from source: set_fact 46400 1727204553.59094: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204553.59097: when evaluation is False, skipping this task 46400 1727204553.59101: _execute() done 46400 1727204553.59104: dumping result to json 46400 1727204553.59106: done dumping result, returning 46400 1727204553.59109: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000f1a] 46400 1727204553.59117: sending task result for task 0affcd87-79f5-1303-fda8-000000000f1a 46400 1727204553.59213: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f1a 46400 1727204553.59217: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204553.59275: no more pending results, returning what we have 46400 1727204553.59280: results queue empty 46400 1727204553.59281: checking for any_errors_fatal 46400 1727204553.59290: done checking for any_errors_fatal 46400 1727204553.59291: checking for max_fail_percentage 46400 1727204553.59294: done checking for max_fail_percentage 46400 1727204553.59295: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.59296: done checking to see if all hosts have failed 46400 1727204553.59297: getting the remaining hosts for this loop 46400 1727204553.59299: done getting the remaining hosts for this loop 46400 1727204553.59303: getting the next task for host managed-node2 46400 1727204553.59313: done getting next task for host managed-node2 46400 1727204553.59316: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204553.59322: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204553.59327: getting variables 46400 1727204553.59329: in VariableManager get_vars() 46400 1727204553.59371: Calling all_inventory to load vars for managed-node2 46400 1727204553.59374: Calling groups_inventory to load vars for managed-node2 46400 1727204553.59379: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.59393: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.59397: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.59400: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.61252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.62954: done with get_vars() 46400 1727204553.62985: done getting variables 46400 1727204553.63056: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.63181: variable 'profile' from source: play vars 46400 1727204553.63185: variable 'interface' from source: play vars 46400 1727204553.63251: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.056) 0:00:43.917 ***** 46400 1727204553.63285: entering _queue_task() for managed-node2/command 46400 1727204553.63635: worker is 1 (out of 1 available) 46400 1727204553.63649: exiting _queue_task() for managed-node2/command 46400 1727204553.63661: done queuing things up, now waiting for results queue to drain 46400 1727204553.63662: waiting for pending results... 
46400 1727204553.63950: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204553.64083: in run() - task 0affcd87-79f5-1303-fda8-000000000f1b 46400 1727204553.64095: variable 'ansible_search_path' from source: unknown 46400 1727204553.64098: variable 'ansible_search_path' from source: unknown 46400 1727204553.64141: calling self._execute() 46400 1727204553.64232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.64236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.64250: variable 'omit' from source: magic vars 46400 1727204553.64610: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.64620: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.64748: variable 'profile_stat' from source: set_fact 46400 1727204553.64765: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204553.64769: when evaluation is False, skipping this task 46400 1727204553.64772: _execute() done 46400 1727204553.64775: dumping result to json 46400 1727204553.64779: done dumping result, returning 46400 1727204553.64785: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000f1b] 46400 1727204553.64791: sending task result for task 0affcd87-79f5-1303-fda8-000000000f1b 46400 1727204553.64887: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f1b 46400 1727204553.64891: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204553.64944: no more pending results, returning what we have 46400 1727204553.64949: results queue empty 46400 1727204553.64950: checking for any_errors_fatal 46400 1727204553.64960: done checking for any_errors_fatal 46400 1727204553.64961: checking for max_fail_percentage 46400 1727204553.64963: done checking for max_fail_percentage 46400 1727204553.64965: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.64966: done checking to see if all hosts have failed 46400 1727204553.64967: getting the remaining hosts for this loop 46400 1727204553.64969: done getting the remaining hosts for this loop 46400 1727204553.64973: getting the next task for host managed-node2 46400 1727204553.64984: done getting next task for host managed-node2 46400 1727204553.64986: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204553.64992: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204553.64997: getting variables 46400 1727204553.64998: in VariableManager get_vars() 46400 1727204553.65036: Calling all_inventory to load vars for managed-node2 46400 1727204553.65039: Calling groups_inventory to load vars for managed-node2 46400 1727204553.65044: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.65058: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.65061: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.65070: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.66792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.68506: done with get_vars() 46400 1727204553.68540: done getting variables 46400 1727204553.68606: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.68730: variable 'profile' from source: play vars 46400 1727204553.68734: variable 'interface' from source: play vars 46400 1727204553.68798: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.055) 0:00:43.972 ***** 46400 1727204553.68830: entering _queue_task() for managed-node2/set_fact 46400 1727204553.69267: worker is 1 (out of 1 available) 46400 1727204553.69279: exiting _queue_task() for managed-node2/set_fact 46400 1727204553.69293: done queuing things up, now waiting for results queue to drain 46400 1727204553.69295: waiting for pending results... 
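The command task from get_profile_stat.yml:62 above is skipped on managed-node2 because its conditional profile_stat.stat.exists evaluates to False, and the set_fact task from get_profile_stat.yml:69 queued just above is skipped the same way below. A task that produces this skip pattern plausibly looks like the following sketch; only the task name, the command action, and the two evaluated conditionals are taken from the log, while the command line and the register name are assumptions:

    # Hypothetical sketch, not the actual contents of get_profile_stat.yml.
    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      # The real command line is not shown in this log; grep is only an illustration.
      command: grep -F fingerprint /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: fingerprint_comment   # hypothetical register name
      when:
        - ansible_distribution_major_version != '6'   # possibly inherited from a parent block
        - profile_stat.stat.exists                    # False in this run, hence the skip
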
46400 1727204553.69862: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204553.70182: in run() - task 0affcd87-79f5-1303-fda8-000000000f1c 46400 1727204553.70202: variable 'ansible_search_path' from source: unknown 46400 1727204553.70206: variable 'ansible_search_path' from source: unknown 46400 1727204553.70240: calling self._execute() 46400 1727204553.70449: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.70455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.70466: variable 'omit' from source: magic vars 46400 1727204553.71140: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.71152: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.71293: variable 'profile_stat' from source: set_fact 46400 1727204553.71303: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204553.71306: when evaluation is False, skipping this task 46400 1727204553.71309: _execute() done 46400 1727204553.71312: dumping result to json 46400 1727204553.71315: done dumping result, returning 46400 1727204553.71320: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000000f1c] 46400 1727204553.71327: sending task result for task 0affcd87-79f5-1303-fda8-000000000f1c 46400 1727204553.71474: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f1c 46400 1727204553.71478: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204553.71525: no more pending results, returning what we have 46400 1727204553.71530: results queue empty 46400 1727204553.71531: checking for any_errors_fatal 46400 1727204553.71538: done checking for any_errors_fatal 46400 1727204553.71539: checking for max_fail_percentage 46400 1727204553.71542: done checking for max_fail_percentage 46400 1727204553.71543: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.71544: done checking to see if all hosts have failed 46400 1727204553.71544: getting the remaining hosts for this loop 46400 1727204553.71547: done getting the remaining hosts for this loop 46400 1727204553.71551: getting the next task for host managed-node2 46400 1727204553.71566: done getting next task for host managed-node2 46400 1727204553.71570: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 46400 1727204553.71574: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204553.71580: getting variables 46400 1727204553.71582: in VariableManager get_vars() 46400 1727204553.71622: Calling all_inventory to load vars for managed-node2 46400 1727204553.71625: Calling groups_inventory to load vars for managed-node2 46400 1727204553.71629: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.71646: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.71649: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.71652: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.80537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.82209: done with get_vars() 46400 1727204553.82244: done getting variables 46400 1727204553.82300: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.82403: variable 'profile' from source: play vars 46400 1727204553.82406: variable 'interface' from source: play vars 46400 1727204553.82467: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.136) 0:00:44.109 ***** 46400 1727204553.82494: entering _queue_task() for managed-node2/assert 46400 1727204553.82827: worker is 1 (out of 1 available) 46400 1727204553.82843: exiting _queue_task() for managed-node2/assert 46400 1727204553.82855: done queuing things up, now waiting for results queue to drain 46400 1727204553.82857: waiting for pending results... 
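The skipped set_fact above sits behind the same guard. A plausible shape for it, with the fact name inferred from the assert further below that checks lsr_net_profile_fingerprint and the value left as an assumption:

    # Hypothetical sketch; the fact name is inferred from a later assert and
    # the value is an assumption, not taken from get_profile_stat.yml.
    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: true
      when:
        - ansible_distribution_major_version != '6'
        - profile_stat.stat.exists

Note that the later fingerprint assert still passes even though this task is skipped, so in this run the fact is evidently set by a different branch of the profile-stat tasks.
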
46400 1727204553.83148: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 46400 1727204553.83272: in run() - task 0affcd87-79f5-1303-fda8-000000000e8c 46400 1727204553.83282: variable 'ansible_search_path' from source: unknown 46400 1727204553.83287: variable 'ansible_search_path' from source: unknown 46400 1727204553.83324: calling self._execute() 46400 1727204553.83419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.83423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.83434: variable 'omit' from source: magic vars 46400 1727204553.83809: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.83820: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.83824: variable 'omit' from source: magic vars 46400 1727204553.83877: variable 'omit' from source: magic vars 46400 1727204553.83977: variable 'profile' from source: play vars 46400 1727204553.83980: variable 'interface' from source: play vars 46400 1727204553.84037: variable 'interface' from source: play vars 46400 1727204553.84063: variable 'omit' from source: magic vars 46400 1727204553.84108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204553.84142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204553.84169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204553.84189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.84200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.84228: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204553.84232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.84234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.84331: Set connection var ansible_shell_type to sh 46400 1727204553.84341: Set connection var ansible_shell_executable to /bin/sh 46400 1727204553.84347: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204553.84350: Set connection var ansible_connection to ssh 46400 1727204553.84357: Set connection var ansible_pipelining to False 46400 1727204553.84366: Set connection var ansible_timeout to 10 46400 1727204553.84391: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.84394: variable 'ansible_connection' from source: unknown 46400 1727204553.84397: variable 'ansible_module_compression' from source: unknown 46400 1727204553.84399: variable 'ansible_shell_type' from source: unknown 46400 1727204553.84401: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.84403: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.84406: variable 'ansible_pipelining' from source: unknown 46400 1727204553.84409: variable 'ansible_timeout' from source: unknown 46400 1727204553.84411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.84550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204553.84566: variable 'omit' from source: magic vars 46400 1727204553.84569: starting attempt loop 46400 1727204553.84572: running the handler 46400 1727204553.84673: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204553.84676: Evaluated conditional (lsr_net_profile_exists): True 46400 1727204553.84686: handler run complete 46400 1727204553.84698: attempt loop complete, returning result 46400 1727204553.84706: _execute() done 46400 1727204553.84709: dumping result to json 46400 1727204553.84712: done dumping result, returning 46400 1727204553.84719: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [0affcd87-79f5-1303-fda8-000000000e8c] 46400 1727204553.84725: sending task result for task 0affcd87-79f5-1303-fda8-000000000e8c ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204553.84863: no more pending results, returning what we have 46400 1727204553.84869: results queue empty 46400 1727204553.84871: checking for any_errors_fatal 46400 1727204553.84880: done checking for any_errors_fatal 46400 1727204553.84881: checking for max_fail_percentage 46400 1727204553.84883: done checking for max_fail_percentage 46400 1727204553.84884: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.84885: done checking to see if all hosts have failed 46400 1727204553.84886: getting the remaining hosts for this loop 46400 1727204553.84887: done getting the remaining hosts for this loop 46400 1727204553.84891: getting the next task for host managed-node2 46400 1727204553.84900: done getting next task for host managed-node2 46400 1727204553.84904: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 46400 1727204553.84907: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204553.84912: getting variables 46400 1727204553.84913: in VariableManager get_vars() 46400 1727204553.84952: Calling all_inventory to load vars for managed-node2 46400 1727204553.84954: Calling groups_inventory to load vars for managed-node2 46400 1727204553.84958: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.84973: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.84976: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.84980: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.85671: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e8c 46400 1727204553.85674: WORKER PROCESS EXITING 46400 1727204553.86637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.89078: done with get_vars() 46400 1727204553.89115: done getting variables 46400 1727204553.89190: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.89317: variable 'profile' from source: play vars 46400 1727204553.89321: variable 'interface' from source: play vars 46400 1727204553.89383: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.069) 0:00:44.178 ***** 46400 1727204553.89421: entering _queue_task() for managed-node2/assert 46400 1727204553.89746: worker is 1 (out of 1 available) 46400 1727204553.89758: exiting _queue_task() for managed-node2/assert 46400 1727204553.89772: done queuing things up, now waiting for results queue to drain 46400 1727204553.89774: waiting for pending results... 
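The assert from assert_profile_present.yml:5 above passes because lsr_net_profile_exists is True, and the results below show the asserts at lines 10 and 15 of that file doing the same for lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint. A minimal sketch of that file consistent with the task names and evaluated conditionals in this log (anything beyond that about the real file is unknown here):

    # Hypothetical reconstruction based only on the task names and the
    # "Evaluated conditional" lines in this log.
    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that: lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that: lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that: lsr_net_profile_fingerprint
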
46400 1727204553.90057: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 46400 1727204553.90180: in run() - task 0affcd87-79f5-1303-fda8-000000000e8d 46400 1727204553.90192: variable 'ansible_search_path' from source: unknown 46400 1727204553.90196: variable 'ansible_search_path' from source: unknown 46400 1727204553.90235: calling self._execute() 46400 1727204553.90328: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.90333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.90342: variable 'omit' from source: magic vars 46400 1727204553.90715: variable 'ansible_distribution_major_version' from source: facts 46400 1727204553.90726: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204553.90733: variable 'omit' from source: magic vars 46400 1727204553.90786: variable 'omit' from source: magic vars 46400 1727204553.90884: variable 'profile' from source: play vars 46400 1727204553.90889: variable 'interface' from source: play vars 46400 1727204553.91063: variable 'interface' from source: play vars 46400 1727204553.91069: variable 'omit' from source: magic vars 46400 1727204553.91380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204553.91384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204553.91387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204553.91390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.91393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204553.91396: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204553.91398: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.91400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.91403: Set connection var ansible_shell_type to sh 46400 1727204553.91405: Set connection var ansible_shell_executable to /bin/sh 46400 1727204553.91408: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204553.91411: Set connection var ansible_connection to ssh 46400 1727204553.91414: Set connection var ansible_pipelining to False 46400 1727204553.91417: Set connection var ansible_timeout to 10 46400 1727204553.91419: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.91423: variable 'ansible_connection' from source: unknown 46400 1727204553.91425: variable 'ansible_module_compression' from source: unknown 46400 1727204553.91428: variable 'ansible_shell_type' from source: unknown 46400 1727204553.91431: variable 'ansible_shell_executable' from source: unknown 46400 1727204553.91433: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204553.91436: variable 'ansible_pipelining' from source: unknown 46400 1727204553.91439: variable 'ansible_timeout' from source: unknown 46400 1727204553.91441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204553.91568: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204553.91577: variable 'omit' from source: magic vars 46400 1727204553.91585: starting attempt loop 46400 1727204553.91588: running the handler 46400 1727204553.91701: variable 'lsr_net_profile_ansible_managed' from source: set_fact 46400 1727204553.91704: Evaluated conditional (lsr_net_profile_ansible_managed): True 46400 1727204553.91711: handler run complete 46400 1727204553.91727: attempt loop complete, returning result 46400 1727204553.91730: _execute() done 46400 1727204553.91733: dumping result to json 46400 1727204553.91735: done dumping result, returning 46400 1727204553.91741: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcd87-79f5-1303-fda8-000000000e8d] 46400 1727204553.91747: sending task result for task 0affcd87-79f5-1303-fda8-000000000e8d 46400 1727204553.91839: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e8d 46400 1727204553.91842: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204553.91925: no more pending results, returning what we have 46400 1727204553.91929: results queue empty 46400 1727204553.91930: checking for any_errors_fatal 46400 1727204553.91937: done checking for any_errors_fatal 46400 1727204553.91938: checking for max_fail_percentage 46400 1727204553.91940: done checking for max_fail_percentage 46400 1727204553.91941: checking to see if all hosts have failed and the running result is not ok 46400 1727204553.91942: done checking to see if all hosts have failed 46400 1727204553.91943: getting the remaining hosts for this loop 46400 1727204553.91945: done getting the remaining hosts for this loop 46400 1727204553.91949: getting the next task for host managed-node2 46400 1727204553.91957: done getting next task for host managed-node2 46400 1727204553.91959: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 46400 1727204553.91963: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204553.91968: getting variables 46400 1727204553.91970: in VariableManager get_vars() 46400 1727204553.92005: Calling all_inventory to load vars for managed-node2 46400 1727204553.92007: Calling groups_inventory to load vars for managed-node2 46400 1727204553.92011: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204553.92022: Calling all_plugins_play to load vars for managed-node2 46400 1727204553.92025: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204553.92028: Calling groups_plugins_play to load vars for managed-node2 46400 1727204553.94152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204553.98714: done with get_vars() 46400 1727204553.98755: done getting variables 46400 1727204553.98827: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204553.98965: variable 'profile' from source: play vars 46400 1727204553.98970: variable 'interface' from source: play vars 46400 1727204553.99047: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:02:33 -0400 (0:00:00.096) 0:00:44.275 ***** 46400 1727204553.99085: entering _queue_task() for managed-node2/assert 46400 1727204553.99556: worker is 1 (out of 1 available) 46400 1727204553.99572: exiting _queue_task() for managed-node2/assert 46400 1727204553.99586: done queuing things up, now waiting for results queue to drain 46400 1727204553.99588: waiting for pending results... 
46400 1727204554.00720: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 46400 1727204554.00982: in run() - task 0affcd87-79f5-1303-fda8-000000000e8e 46400 1727204554.01240: variable 'ansible_search_path' from source: unknown 46400 1727204554.01267: variable 'ansible_search_path' from source: unknown 46400 1727204554.01323: calling self._execute() 46400 1727204554.01552: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.01572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.01587: variable 'omit' from source: magic vars 46400 1727204554.02496: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.02508: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.02514: variable 'omit' from source: magic vars 46400 1727204554.02558: variable 'omit' from source: magic vars 46400 1727204554.02659: variable 'profile' from source: play vars 46400 1727204554.02667: variable 'interface' from source: play vars 46400 1727204554.02728: variable 'interface' from source: play vars 46400 1727204554.02747: variable 'omit' from source: magic vars 46400 1727204554.02793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204554.02833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.02854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204554.02872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.02885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.02913: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.02917: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.02925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.03015: Set connection var ansible_shell_type to sh 46400 1727204554.03024: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.03034: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.03039: Set connection var ansible_connection to ssh 46400 1727204554.03045: Set connection var ansible_pipelining to False 46400 1727204554.03050: Set connection var ansible_timeout to 10 46400 1727204554.03078: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.03082: variable 'ansible_connection' from source: unknown 46400 1727204554.03085: variable 'ansible_module_compression' from source: unknown 46400 1727204554.03087: variable 'ansible_shell_type' from source: unknown 46400 1727204554.03089: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.03092: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.03094: variable 'ansible_pipelining' from source: unknown 46400 1727204554.03096: variable 'ansible_timeout' from source: unknown 46400 1727204554.03099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.03236: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.03251: variable 'omit' from source: magic vars 46400 1727204554.03256: starting attempt loop 46400 1727204554.03259: running the handler 46400 1727204554.03367: variable 'lsr_net_profile_fingerprint' from source: set_fact 46400 1727204554.03371: Evaluated conditional (lsr_net_profile_fingerprint): True 46400 1727204554.03376: handler run complete 46400 1727204554.03389: attempt loop complete, returning result 46400 1727204554.03392: _execute() done 46400 1727204554.03395: dumping result to json 46400 1727204554.03397: done dumping result, returning 46400 1727204554.03404: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [0affcd87-79f5-1303-fda8-000000000e8e] 46400 1727204554.03409: sending task result for task 0affcd87-79f5-1303-fda8-000000000e8e ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204554.03550: no more pending results, returning what we have 46400 1727204554.03554: results queue empty 46400 1727204554.03556: checking for any_errors_fatal 46400 1727204554.03566: done checking for any_errors_fatal 46400 1727204554.03567: checking for max_fail_percentage 46400 1727204554.03570: done checking for max_fail_percentage 46400 1727204554.03571: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.03572: done checking to see if all hosts have failed 46400 1727204554.03573: getting the remaining hosts for this loop 46400 1727204554.03575: done getting the remaining hosts for this loop 46400 1727204554.03579: getting the next task for host managed-node2 46400 1727204554.03590: done getting next task for host managed-node2 46400 1727204554.03594: ^ task is: TASK: Conditional asserts 46400 1727204554.03597: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.03601: getting variables 46400 1727204554.03603: in VariableManager get_vars() 46400 1727204554.03641: Calling all_inventory to load vars for managed-node2 46400 1727204554.03644: Calling groups_inventory to load vars for managed-node2 46400 1727204554.03648: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.03661: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.03666: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.03670: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.04473: done sending task result for task 0affcd87-79f5-1303-fda8-000000000e8e 46400 1727204554.04477: WORKER PROCESS EXITING 46400 1727204554.05785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.07807: done with get_vars() 46400 1727204554.07831: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.088) 0:00:44.363 ***** 46400 1727204554.07941: entering _queue_task() for managed-node2/include_tasks 46400 1727204554.08301: worker is 1 (out of 1 available) 46400 1727204554.08315: exiting _queue_task() for managed-node2/include_tasks 46400 1727204554.08329: done queuing things up, now waiting for results queue to drain 46400 1727204554.08331: waiting for pending results... 46400 1727204554.09052: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204554.09169: in run() - task 0affcd87-79f5-1303-fda8-000000000a4f 46400 1727204554.09183: variable 'ansible_search_path' from source: unknown 46400 1727204554.09186: variable 'ansible_search_path' from source: unknown 46400 1727204554.09505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204554.12537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204554.12615: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204554.12651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204554.12698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204554.12723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204554.12830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204554.12858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204554.12893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204554.12943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204554.12956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204554.13117: dumping result to json 46400 1727204554.13121: done dumping result, returning 46400 1727204554.13127: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-000000000a4f] 46400 1727204554.13134: sending task result for task 0affcd87-79f5-1303-fda8-000000000a4f skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 46400 1727204554.13293: no more pending results, returning what we have 46400 1727204554.13297: results queue empty 46400 1727204554.13298: checking for any_errors_fatal 46400 1727204554.13308: done checking for any_errors_fatal 46400 1727204554.13309: checking for max_fail_percentage 46400 1727204554.13311: done checking for max_fail_percentage 46400 1727204554.13312: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.13313: done checking to see if all hosts have failed 46400 1727204554.13314: getting the remaining hosts for this loop 46400 1727204554.13315: done getting the remaining hosts for this loop 46400 1727204554.13320: getting the next task for host managed-node2 46400 1727204554.13329: done getting next task for host managed-node2 46400 1727204554.13332: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204554.13335: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.13339: getting variables 46400 1727204554.13340: in VariableManager get_vars() 46400 1727204554.13385: Calling all_inventory to load vars for managed-node2 46400 1727204554.13388: Calling groups_inventory to load vars for managed-node2 46400 1727204554.13392: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.13405: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.13408: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.13413: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.13971: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a4f 46400 1727204554.13976: WORKER PROCESS EXITING 46400 1727204554.15278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.17541: done with get_vars() 46400 1727204554.17680: done getting variables 46400 1727204554.17745: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204554.17885: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.099) 0:00:44.463 ***** 46400 1727204554.17917: entering _queue_task() for managed-node2/debug 46400 1727204554.18730: worker is 1 (out of 1 available) 46400 1727204554.18743: exiting _queue_task() for managed-node2/debug 46400 1727204554.18756: done queuing things up, now waiting for results queue to drain 46400 1727204554.18758: waiting for pending results... 
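Two patterns are visible just above: the Conditional asserts include at run_test.yml:42 is skipped with "No items in the list", which is how a loop-driven include_tasks behaves when its loop resolves to an empty list, and the task at run_test.yml:47 is a debug templated with lsr_description from the include parameters. A sketch consistent with both follows; the loop variable and the file expression are assumptions, and the debug message matches the task result printed below:

    # Hypothetical sketch of the two run_test.yml tasks; only the task names,
    # action plugins, and skip/ok behaviour come from the log.
    - name: Conditional asserts
      include_tasks: "{{ item.what }}"        # expression is an assumption
      loop: "{{ lsr_assert_when | d([]) }}"   # hypothetical variable; empty here, hence the skip

    - name: Success in test '{{ lsr_description }}'
      debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"
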
46400 1727204554.19067: running TaskExecutor() for managed-node2/TASK: Success in test 'I can activate an existing profile' 46400 1727204554.19180: in run() - task 0affcd87-79f5-1303-fda8-000000000a50 46400 1727204554.19193: variable 'ansible_search_path' from source: unknown 46400 1727204554.19196: variable 'ansible_search_path' from source: unknown 46400 1727204554.19235: calling self._execute() 46400 1727204554.19330: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.19339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.19348: variable 'omit' from source: magic vars 46400 1727204554.19757: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.19779: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.19785: variable 'omit' from source: magic vars 46400 1727204554.19827: variable 'omit' from source: magic vars 46400 1727204554.19940: variable 'lsr_description' from source: include params 46400 1727204554.19959: variable 'omit' from source: magic vars 46400 1727204554.20014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204554.20054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.20081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204554.20104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.20115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.20145: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.20148: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.20151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.20263: Set connection var ansible_shell_type to sh 46400 1727204554.20280: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.20284: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.20290: Set connection var ansible_connection to ssh 46400 1727204554.20294: Set connection var ansible_pipelining to False 46400 1727204554.20300: Set connection var ansible_timeout to 10 46400 1727204554.20331: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.20334: variable 'ansible_connection' from source: unknown 46400 1727204554.20337: variable 'ansible_module_compression' from source: unknown 46400 1727204554.20340: variable 'ansible_shell_type' from source: unknown 46400 1727204554.20342: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.20345: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.20347: variable 'ansible_pipelining' from source: unknown 46400 1727204554.20349: variable 'ansible_timeout' from source: unknown 46400 1727204554.20351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.20748: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204554.20759: variable 'omit' from source: magic vars 46400 1727204554.20770: starting attempt loop 46400 1727204554.20773: running the handler 46400 1727204554.20821: handler run complete 46400 1727204554.20834: attempt loop complete, returning result 46400 1727204554.20837: _execute() done 46400 1727204554.20841: dumping result to json 46400 1727204554.20843: done dumping result, returning 46400 1727204554.20850: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can activate an existing profile' [0affcd87-79f5-1303-fda8-000000000a50] 46400 1727204554.20855: sending task result for task 0affcd87-79f5-1303-fda8-000000000a50 46400 1727204554.20951: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a50 46400 1727204554.20956: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 46400 1727204554.21009: no more pending results, returning what we have 46400 1727204554.21013: results queue empty 46400 1727204554.21014: checking for any_errors_fatal 46400 1727204554.21021: done checking for any_errors_fatal 46400 1727204554.21022: checking for max_fail_percentage 46400 1727204554.21024: done checking for max_fail_percentage 46400 1727204554.21025: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.21025: done checking to see if all hosts have failed 46400 1727204554.21026: getting the remaining hosts for this loop 46400 1727204554.21028: done getting the remaining hosts for this loop 46400 1727204554.21032: getting the next task for host managed-node2 46400 1727204554.21041: done getting next task for host managed-node2 46400 1727204554.21044: ^ task is: TASK: Cleanup 46400 1727204554.21047: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.21051: getting variables 46400 1727204554.21053: in VariableManager get_vars() 46400 1727204554.21098: Calling all_inventory to load vars for managed-node2 46400 1727204554.21101: Calling groups_inventory to load vars for managed-node2 46400 1727204554.21105: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.21117: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.21119: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.21122: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.23967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.25669: done with get_vars() 46400 1727204554.25698: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.078) 0:00:44.542 ***** 46400 1727204554.25800: entering _queue_task() for managed-node2/include_tasks 46400 1727204554.26144: worker is 1 (out of 1 available) 46400 1727204554.26157: exiting _queue_task() for managed-node2/include_tasks 46400 1727204554.26176: done queuing things up, now waiting for results queue to drain 46400 1727204554.26178: waiting for pending results... 46400 1727204554.26471: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204554.26577: in run() - task 0affcd87-79f5-1303-fda8-000000000a54 46400 1727204554.26590: variable 'ansible_search_path' from source: unknown 46400 1727204554.26594: variable 'ansible_search_path' from source: unknown 46400 1727204554.26644: variable 'lsr_cleanup' from source: include params 46400 1727204554.26853: variable 'lsr_cleanup' from source: include params 46400 1727204554.26918: variable 'omit' from source: magic vars 46400 1727204554.27055: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.27067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.27079: variable 'omit' from source: magic vars 46400 1727204554.27336: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.27346: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.27352: variable 'item' from source: unknown 46400 1727204554.27427: variable 'item' from source: unknown 46400 1727204554.27463: variable 'item' from source: unknown 46400 1727204554.27532: variable 'item' from source: unknown 46400 1727204554.27657: dumping result to json 46400 1727204554.27661: done dumping result, returning 46400 1727204554.27670: done running TaskExecutor() for managed-node2/TASK: Cleanup [0affcd87-79f5-1303-fda8-000000000a54] 46400 1727204554.27673: sending task result for task 0affcd87-79f5-1303-fda8-000000000a54 46400 1727204554.27711: done sending task result for task 0affcd87-79f5-1303-fda8-000000000a54 46400 1727204554.27714: WORKER PROCESS EXITING 46400 1727204554.27800: no more pending results, returning what we have 46400 1727204554.27805: in VariableManager get_vars() 46400 1727204554.27851: Calling all_inventory to load vars for managed-node2 46400 1727204554.27854: Calling groups_inventory to load vars for managed-node2 46400 1727204554.27858: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.27879: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204554.27882: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.27886: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.29496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.31155: done with get_vars() 46400 1727204554.31185: variable 'ansible_search_path' from source: unknown 46400 1727204554.31186: variable 'ansible_search_path' from source: unknown 46400 1727204554.31227: we have included files to process 46400 1727204554.31228: generating all_blocks data 46400 1727204554.31230: done generating all_blocks data 46400 1727204554.31237: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204554.31239: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204554.31241: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204554.31446: done processing included file 46400 1727204554.31448: iterating over new_blocks loaded from include file 46400 1727204554.31450: in VariableManager get_vars() 46400 1727204554.31470: done with get_vars() 46400 1727204554.31472: filtering new block on tags 46400 1727204554.31498: done filtering new block on tags 46400 1727204554.31501: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204554.31506: extending task lists for all hosts with included blocks 46400 1727204554.32780: done extending task lists 46400 1727204554.32782: done processing included files 46400 1727204554.32783: results queue empty 46400 1727204554.32784: checking for any_errors_fatal 46400 1727204554.32787: done checking for any_errors_fatal 46400 1727204554.32788: checking for max_fail_percentage 46400 1727204554.32789: done checking for max_fail_percentage 46400 1727204554.32790: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.32791: done checking to see if all hosts have failed 46400 1727204554.32792: getting the remaining hosts for this loop 46400 1727204554.32793: done getting the remaining hosts for this loop 46400 1727204554.32795: getting the next task for host managed-node2 46400 1727204554.32800: done getting next task for host managed-node2 46400 1727204554.32802: ^ task is: TASK: Cleanup profile and device 46400 1727204554.32805: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204554.32807: getting variables 46400 1727204554.32808: in VariableManager get_vars() 46400 1727204554.32821: Calling all_inventory to load vars for managed-node2 46400 1727204554.32823: Calling groups_inventory to load vars for managed-node2 46400 1727204554.32825: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.32831: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.32834: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.32836: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.34184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.35856: done with get_vars() 46400 1727204554.35889: done getting variables 46400 1727204554.35941: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.101) 0:00:44.644 ***** 46400 1727204554.35982: entering _queue_task() for managed-node2/shell 46400 1727204554.36340: worker is 1 (out of 1 available) 46400 1727204554.36352: exiting _queue_task() for managed-node2/shell 46400 1727204554.36371: done queuing things up, now waiting for results queue to drain 46400 1727204554.36372: waiting for pending results... 
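The Cleanup step at run_test.yml:66 is another loop-driven include: lsr_cleanup comes from the include parameters and, per the "(item=tasks/cleanup_profile+device.yml)" result above, contains at least that one file, whose first task (the shell-based Cleanup profile and device just queued) runs next over SSH. A minimal sketch, assuming lsr_cleanup is a plain list of task-file paths:

    # Hypothetical sketch of the Cleanup include at run_test.yml:66.
    - name: Cleanup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_cleanup }}"
      # In this run lsr_cleanup contains at least
      # tasks/cleanup_profile+device.yml, per the include result above.
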
46400 1727204554.36660: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204554.36753: in run() - task 0affcd87-79f5-1303-fda8-000000000f6d 46400 1727204554.36772: variable 'ansible_search_path' from source: unknown 46400 1727204554.36777: variable 'ansible_search_path' from source: unknown 46400 1727204554.36811: calling self._execute() 46400 1727204554.36914: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.36921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.36937: variable 'omit' from source: magic vars 46400 1727204554.37320: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.37331: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.37338: variable 'omit' from source: magic vars 46400 1727204554.37390: variable 'omit' from source: magic vars 46400 1727204554.37537: variable 'interface' from source: play vars 46400 1727204554.37558: variable 'omit' from source: magic vars 46400 1727204554.37609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204554.37643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.37668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204554.37685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.37702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.37729: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.37733: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.37735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.37837: Set connection var ansible_shell_type to sh 46400 1727204554.37846: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.37852: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.37857: Set connection var ansible_connection to ssh 46400 1727204554.37867: Set connection var ansible_pipelining to False 46400 1727204554.37872: Set connection var ansible_timeout to 10 46400 1727204554.37896: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.37899: variable 'ansible_connection' from source: unknown 46400 1727204554.37901: variable 'ansible_module_compression' from source: unknown 46400 1727204554.37904: variable 'ansible_shell_type' from source: unknown 46400 1727204554.37911: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.37914: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.37918: variable 'ansible_pipelining' from source: unknown 46400 1727204554.37921: variable 'ansible_timeout' from source: unknown 46400 1727204554.37925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.38071: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204554.38083: variable 'omit' from source: magic vars 46400 1727204554.38087: starting attempt loop 46400 1727204554.38089: running the handler 46400 1727204554.38101: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.38120: _low_level_execute_command(): starting 46400 1727204554.38132: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204554.38937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204554.38950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.38963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.38981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.39025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.39032: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204554.39042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.39055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204554.39067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204554.39076: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204554.39085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.39094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.39107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.39116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.39124: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204554.39138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.39215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.39234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204554.39252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.39328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.41109: stdout chunk (state=3): >>>/root <<< 46400 1727204554.41296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204554.41299: stdout chunk (state=3): >>><<< 46400 1727204554.41311: stderr chunk (state=3): >>><<< 46400 1727204554.41340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204554.41356: _low_level_execute_command(): starting 46400 1727204554.41363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811 `" && echo ansible-tmp-1727204554.4134023-49539-107563965936811="` echo /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811 `" ) && sleep 0' 46400 1727204554.42540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.42544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.42590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.42593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204554.42596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204554.42598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.42656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.42670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.42733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.44584: stdout chunk (state=3): >>>ansible-tmp-1727204554.4134023-49539-107563965936811=/root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811 <<< 46400 1727204554.44693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204554.44770: stderr chunk (state=3): >>><<< 46400 1727204554.44773: stdout chunk (state=3): >>><<< 46400 1727204554.44972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204554.4134023-49539-107563965936811=/root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204554.44975: variable 'ansible_module_compression' from source: unknown 46400 1727204554.44978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204554.44980: variable 'ansible_facts' from source: unknown 46400 1727204554.45026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/AnsiballZ_command.py 46400 1727204554.45400: Sending initial data 46400 1727204554.45412: Sent initial data (156 bytes) 46400 1727204554.47925: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204554.47948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.47973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.47998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.48112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.48128: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204554.48154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.48176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204554.48189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204554.48201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204554.48213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.48228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.48250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.48341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.48357: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204554.48374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.48460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.48524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204554.48542: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.48615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.50327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204554.50370: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204554.50404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpum5cq21k /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/AnsiballZ_command.py <<< 46400 1727204554.50442: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204554.51795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204554.51879: stderr chunk (state=3): >>><<< 46400 1727204554.51882: stdout chunk (state=3): >>><<< 46400 1727204554.51905: done transferring module to remote 46400 1727204554.51915: _low_level_execute_command(): starting 46400 1727204554.51920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/ /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/AnsiballZ_command.py && sleep 0' 46400 1727204554.53508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204554.53537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.53547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.53651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.53695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.53703: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204554.53714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.53727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204554.53735: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204554.53747: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204554.53756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.53769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.53782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.53790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.53797: stderr chunk (state=3): >>>debug2: match found 
<<< 46400 1727204554.53808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.53998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.54013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204554.54016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.54191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.55923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204554.55926: stdout chunk (state=3): >>><<< 46400 1727204554.55934: stderr chunk (state=3): >>><<< 46400 1727204554.55958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204554.55962: _low_level_execute_command(): starting 46400 1727204554.55971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/AnsiballZ_command.py && sleep 0' 46400 1727204554.57834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204554.57920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.57929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.57943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.57987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.57994: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204554.58005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.58022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204554.58030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204554.58036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204554.58044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.58054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204554.58070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.58132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204554.58138: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204554.58148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.58225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.58355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204554.58360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.58571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.79366: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (985b3c37-4ecd-406f-bfdf-7018e6e80d39) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:34.721480", "end": "2024-09-24 15:02:34.792663", "delta": "0:00:00.071183", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204554.80542: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204554.80604: stderr chunk (state=3): >>><<< 46400 1727204554.80608: stdout chunk (state=3): >>><<< 46400 1727204554.80624: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (985b3c37-4ecd-406f-bfdf-7018e6e80d39) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:02:34.721480", "end": "2024-09-24 15:02:34.792663", "delta": "0:00:00.071183", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
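The module result above shows why this task is reported as failed even though the profile was removed: the four commands run under a single /bin/sh -c invocation without set -e, so each command runs regardless of earlier results, and the task rc is the exit status of the last command only. Here nmcli con delete statebr succeeded (see stdout), but the final ip link del statebr exited non-zero with "Cannot find device" because no kernel device named statebr exists at this point, so the whole task gets rc=1 and failed=true, which the play then ignores. A minimal illustration of that rc behavior, written for illustration only and not taken from this test:

# Only the exit status of the last command in the block becomes the task rc.
- name: rc reflects the last command in a shell block
  ansible.builtin.shell: |
    false    # fails, but the script keeps going and this status is discarded
    true     # last command succeeds, so the task reports rc=0 and "ok"
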
46400 1727204554.80654: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204554.80667: _low_level_execute_command(): starting 46400 1727204554.80672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204554.4134023-49539-107563965936811/ > /dev/null 2>&1 && sleep 0' 46400 1727204554.81142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204554.81146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204554.81185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204554.81188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204554.81190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204554.81193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204554.81250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204554.81253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204554.81255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204554.81296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204554.83081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204554.83137: stderr chunk (state=3): >>><<< 46400 1727204554.83140: stdout chunk (state=3): >>><<< 46400 1727204554.83157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204554.83169: handler run complete 46400 1727204554.83187: Evaluated conditional (False): False 46400 1727204554.83194: attempt loop complete, returning result 46400 1727204554.83197: _execute() done 46400 1727204554.83199: dumping result to json 46400 1727204554.83204: done dumping result, returning 46400 1727204554.83211: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-000000000f6d] 46400 1727204554.83216: sending task result for task 0affcd87-79f5-1303-fda8-000000000f6d 46400 1727204554.83314: done sending task result for task 0affcd87-79f5-1303-fda8-000000000f6d 46400 1727204554.83317: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.071183", "end": "2024-09-24 15:02:34.792663", "rc": 1, "start": "2024-09-24 15:02:34.721480" } STDOUT: Connection 'statebr' (985b3c37-4ecd-406f-bfdf-7018e6e80d39) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204554.83378: no more pending results, returning what we have 46400 1727204554.83382: results queue empty 46400 1727204554.83383: checking for any_errors_fatal 46400 1727204554.83384: done checking for any_errors_fatal 46400 1727204554.83385: checking for max_fail_percentage 46400 1727204554.83387: done checking for max_fail_percentage 46400 1727204554.83387: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.83388: done checking to see if all hosts have failed 46400 1727204554.83389: getting the remaining hosts for this loop 46400 1727204554.83390: done getting the remaining hosts for this loop 46400 1727204554.83394: getting the next task for host managed-node2 46400 1727204554.83406: done getting next task for host managed-node2 46400 1727204554.83409: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204554.83411: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.83415: getting variables 46400 1727204554.83417: in VariableManager get_vars() 46400 1727204554.83453: Calling all_inventory to load vars for managed-node2 46400 1727204554.83455: Calling groups_inventory to load vars for managed-node2 46400 1727204554.83459: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.83478: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.83481: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.83484: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.84325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.85367: done with get_vars() 46400 1727204554.85383: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.494) 0:00:45.139 ***** 46400 1727204554.85454: entering _queue_task() for managed-node2/include_tasks 46400 1727204554.85695: worker is 1 (out of 1 available) 46400 1727204554.85709: exiting _queue_task() for managed-node2/include_tasks 46400 1727204554.85722: done queuing things up, now waiting for results queue to drain 46400 1727204554.85723: waiting for pending results... 46400 1727204554.85904: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204554.85969: in run() - task 0affcd87-79f5-1303-fda8-000000000013 46400 1727204554.85976: variable 'ansible_search_path' from source: unknown 46400 1727204554.86006: calling self._execute() 46400 1727204554.86085: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.86089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.86098: variable 'omit' from source: magic vars 46400 1727204554.86380: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.86391: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.86399: _execute() done 46400 1727204554.86402: dumping result to json 46400 1727204554.86406: done dumping result, returning 46400 1727204554.86409: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-000000000013] 46400 1727204554.86414: sending task result for task 0affcd87-79f5-1303-fda8-000000000013 46400 1727204554.86513: done sending task result for task 0affcd87-79f5-1303-fda8-000000000013 46400 1727204554.86516: WORKER PROCESS EXITING 46400 1727204554.86545: no more pending results, returning what we have 46400 1727204554.86550: in VariableManager get_vars() 46400 1727204554.86595: Calling all_inventory to load vars for managed-node2 46400 1727204554.86598: Calling groups_inventory to load vars for managed-node2 46400 1727204554.86602: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.86613: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.86616: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.86618: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.87432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.88344: done with get_vars() 46400 1727204554.88365: variable 
'ansible_search_path' from source: unknown 46400 1727204554.88377: we have included files to process 46400 1727204554.88378: generating all_blocks data 46400 1727204554.88379: done generating all_blocks data 46400 1727204554.88384: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204554.88385: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204554.88386: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204554.88659: in VariableManager get_vars() 46400 1727204554.88676: done with get_vars() 46400 1727204554.88705: in VariableManager get_vars() 46400 1727204554.88717: done with get_vars() 46400 1727204554.88743: in VariableManager get_vars() 46400 1727204554.88753: done with get_vars() 46400 1727204554.88783: in VariableManager get_vars() 46400 1727204554.88795: done with get_vars() 46400 1727204554.88822: in VariableManager get_vars() 46400 1727204554.88833: done with get_vars() 46400 1727204554.89104: in VariableManager get_vars() 46400 1727204554.89116: done with get_vars() 46400 1727204554.89125: done processing included file 46400 1727204554.89127: iterating over new_blocks loaded from include file 46400 1727204554.89128: in VariableManager get_vars() 46400 1727204554.89136: done with get_vars() 46400 1727204554.89137: filtering new block on tags 46400 1727204554.89203: done filtering new block on tags 46400 1727204554.89205: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204554.89209: extending task lists for all hosts with included blocks 46400 1727204554.89233: done extending task lists 46400 1727204554.89234: done processing included files 46400 1727204554.89235: results queue empty 46400 1727204554.89235: checking for any_errors_fatal 46400 1727204554.89242: done checking for any_errors_fatal 46400 1727204554.89242: checking for max_fail_percentage 46400 1727204554.89243: done checking for max_fail_percentage 46400 1727204554.89243: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.89244: done checking to see if all hosts have failed 46400 1727204554.89245: getting the remaining hosts for this loop 46400 1727204554.89245: done getting the remaining hosts for this loop 46400 1727204554.89247: getting the next task for host managed-node2 46400 1727204554.89250: done getting next task for host managed-node2 46400 1727204554.89251: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204554.89253: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.89254: getting variables 46400 1727204554.89255: in VariableManager get_vars() 46400 1727204554.89265: Calling all_inventory to load vars for managed-node2 46400 1727204554.89267: Calling groups_inventory to load vars for managed-node2 46400 1727204554.89268: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.89272: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.89274: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.89276: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.89995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.90895: done with get_vars() 46400 1727204554.90910: done getting variables 46400 1727204554.90939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204554.91026: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.055) 0:00:45.195 ***** 46400 1727204554.91048: entering _queue_task() for managed-node2/debug 46400 1727204554.91300: worker is 1 (out of 1 available) 46400 1727204554.91314: exiting _queue_task() for managed-node2/debug 46400 1727204554.91328: done queuing things up, now waiting for results queue to drain 46400 1727204554.91330: waiting for pending results... 
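From here the run is driven by run_test.yml, which was included at tests_states.yml:83 with a set of lsr_* parameters (the log attributes them to "include params"). Reconstructed from the task paths and from the values the "Show item" loop echoes below, the include and the first two tasks of run_test.yml presumably look roughly like this; it is an assumed sketch, and the real files pass and print more parameters than are listed here (lsr_assert_when, for example, is looped over but left undefined for this test):

# Assumed shape of the include at tests_states.yml:83, rebuilt from the values
# printed by the "Show item" loop; the actual file may differ in detail.
- name: Include the task 'run_test.yml'
  include_tasks: tasks/run_test.yml
  vars:
    lsr_description: I can remove an existing profile without taking it down
    lsr_setup:
      - tasks/create_bridge_profile.yml
      - tasks/activate_profile.yml
    lsr_test:
      - tasks/remove_profile.yml
    lsr_assert:
      - tasks/assert_device_present.yml
      - tasks/assert_profile_absent.yml

# Assumed first two tasks of run_test.yml (lines 5 and 9 per the task paths).
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "########## {{ lsr_description }} ##########"

- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
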
46400 1727204554.91512: running TaskExecutor() for managed-node2/TASK: TEST: I can remove an existing profile without taking it down 46400 1727204554.91595: in run() - task 0affcd87-79f5-1303-fda8-000000001005 46400 1727204554.91606: variable 'ansible_search_path' from source: unknown 46400 1727204554.91609: variable 'ansible_search_path' from source: unknown 46400 1727204554.91645: calling self._execute() 46400 1727204554.91724: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.91732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.91740: variable 'omit' from source: magic vars 46400 1727204554.92016: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.92026: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.92032: variable 'omit' from source: magic vars 46400 1727204554.92064: variable 'omit' from source: magic vars 46400 1727204554.92135: variable 'lsr_description' from source: include params 46400 1727204554.92149: variable 'omit' from source: magic vars 46400 1727204554.92189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204554.92217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.92235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204554.92247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.92258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.92287: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.92290: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.92293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.92359: Set connection var ansible_shell_type to sh 46400 1727204554.92371: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.92377: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.92384: Set connection var ansible_connection to ssh 46400 1727204554.92387: Set connection var ansible_pipelining to False 46400 1727204554.92392: Set connection var ansible_timeout to 10 46400 1727204554.92411: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.92414: variable 'ansible_connection' from source: unknown 46400 1727204554.92418: variable 'ansible_module_compression' from source: unknown 46400 1727204554.92421: variable 'ansible_shell_type' from source: unknown 46400 1727204554.92423: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.92425: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.92427: variable 'ansible_pipelining' from source: unknown 46400 1727204554.92429: variable 'ansible_timeout' from source: unknown 46400 1727204554.92431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.92539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204554.92548: variable 'omit' from source: magic vars 46400 1727204554.92553: starting attempt loop 46400 1727204554.92556: running the handler 46400 1727204554.92596: handler run complete 46400 1727204554.92607: attempt loop complete, returning result 46400 1727204554.92610: _execute() done 46400 1727204554.92613: dumping result to json 46400 1727204554.92615: done dumping result, returning 46400 1727204554.92621: done running TaskExecutor() for managed-node2/TASK: TEST: I can remove an existing profile without taking it down [0affcd87-79f5-1303-fda8-000000001005] 46400 1727204554.92626: sending task result for task 0affcd87-79f5-1303-fda8-000000001005 46400 1727204554.92717: done sending task result for task 0affcd87-79f5-1303-fda8-000000001005 46400 1727204554.92720: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can remove an existing profile without taking it down ########## 46400 1727204554.92766: no more pending results, returning what we have 46400 1727204554.92771: results queue empty 46400 1727204554.92772: checking for any_errors_fatal 46400 1727204554.92774: done checking for any_errors_fatal 46400 1727204554.92774: checking for max_fail_percentage 46400 1727204554.92776: done checking for max_fail_percentage 46400 1727204554.92777: checking to see if all hosts have failed and the running result is not ok 46400 1727204554.92778: done checking to see if all hosts have failed 46400 1727204554.92779: getting the remaining hosts for this loop 46400 1727204554.92780: done getting the remaining hosts for this loop 46400 1727204554.92784: getting the next task for host managed-node2 46400 1727204554.92792: done getting next task for host managed-node2 46400 1727204554.92795: ^ task is: TASK: Show item 46400 1727204554.92798: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204554.92805: getting variables 46400 1727204554.92807: in VariableManager get_vars() 46400 1727204554.92847: Calling all_inventory to load vars for managed-node2 46400 1727204554.92850: Calling groups_inventory to load vars for managed-node2 46400 1727204554.92854: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204554.92867: Calling all_plugins_play to load vars for managed-node2 46400 1727204554.92870: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204554.92872: Calling groups_plugins_play to load vars for managed-node2 46400 1727204554.93709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204554.94668: done with get_vars() 46400 1727204554.94688: done getting variables 46400 1727204554.94732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:02:34 -0400 (0:00:00.037) 0:00:45.232 ***** 46400 1727204554.94756: entering _queue_task() for managed-node2/debug 46400 1727204554.95003: worker is 1 (out of 1 available) 46400 1727204554.95017: exiting _queue_task() for managed-node2/debug 46400 1727204554.95031: done queuing things up, now waiting for results queue to drain 46400 1727204554.95033: waiting for pending results... 46400 1727204554.95261: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204554.95376: in run() - task 0affcd87-79f5-1303-fda8-000000001006 46400 1727204554.95399: variable 'ansible_search_path' from source: unknown 46400 1727204554.95407: variable 'ansible_search_path' from source: unknown 46400 1727204554.95470: variable 'omit' from source: magic vars 46400 1727204554.95634: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.95650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.95669: variable 'omit' from source: magic vars 46400 1727204554.96039: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.96058: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.96072: variable 'omit' from source: magic vars 46400 1727204554.96112: variable 'omit' from source: magic vars 46400 1727204554.96166: variable 'item' from source: unknown 46400 1727204554.96239: variable 'item' from source: unknown 46400 1727204554.96269: variable 'omit' from source: magic vars 46400 1727204554.96317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204554.96359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.96388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204554.96410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.96427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 46400 1727204554.96481: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.96490: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.96501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.96600: Set connection var ansible_shell_type to sh 46400 1727204554.96608: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.96613: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.96618: Set connection var ansible_connection to ssh 46400 1727204554.96623: Set connection var ansible_pipelining to False 46400 1727204554.96628: Set connection var ansible_timeout to 10 46400 1727204554.96648: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.96652: variable 'ansible_connection' from source: unknown 46400 1727204554.96654: variable 'ansible_module_compression' from source: unknown 46400 1727204554.96657: variable 'ansible_shell_type' from source: unknown 46400 1727204554.96659: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.96666: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.96668: variable 'ansible_pipelining' from source: unknown 46400 1727204554.96671: variable 'ansible_timeout' from source: unknown 46400 1727204554.96673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.96781: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.96789: variable 'omit' from source: magic vars 46400 1727204554.96794: starting attempt loop 46400 1727204554.96796: running the handler 46400 1727204554.96833: variable 'lsr_description' from source: include params 46400 1727204554.96886: variable 'lsr_description' from source: include params 46400 1727204554.96895: handler run complete 46400 1727204554.96909: attempt loop complete, returning result 46400 1727204554.96923: variable 'item' from source: unknown 46400 1727204554.96971: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 46400 1727204554.97121: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.97125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.97127: variable 'omit' from source: magic vars 46400 1727204554.97194: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.97198: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.97203: variable 'omit' from source: magic vars 46400 1727204554.97214: variable 'omit' from source: magic vars 46400 1727204554.97246: variable 'item' from source: unknown 46400 1727204554.97297: variable 'item' from source: unknown 46400 1727204554.97308: variable 'omit' from source: magic vars 46400 1727204554.97323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.97330: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.97336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.97345: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.97352: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.97359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.97409: Set connection var ansible_shell_type to sh 46400 1727204554.97415: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.97420: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.97424: Set connection var ansible_connection to ssh 46400 1727204554.97429: Set connection var ansible_pipelining to False 46400 1727204554.97434: Set connection var ansible_timeout to 10 46400 1727204554.97450: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.97453: variable 'ansible_connection' from source: unknown 46400 1727204554.97455: variable 'ansible_module_compression' from source: unknown 46400 1727204554.97461: variable 'ansible_shell_type' from source: unknown 46400 1727204554.97470: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.97473: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.97475: variable 'ansible_pipelining' from source: unknown 46400 1727204554.97477: variable 'ansible_timeout' from source: unknown 46400 1727204554.97482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.97541: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.97548: variable 'omit' from source: magic vars 46400 1727204554.97551: starting attempt loop 46400 1727204554.97553: running the handler 46400 1727204554.97581: variable 'lsr_setup' from source: include params 46400 1727204554.97627: variable 'lsr_setup' from source: include params 46400 1727204554.97667: handler run complete 46400 1727204554.97680: attempt loop complete, returning result 46400 1727204554.97692: variable 'item' from source: unknown 46400 1727204554.97736: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 46400 1727204554.97837: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.97841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.97843: variable 'omit' from source: magic vars 46400 1727204554.97936: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.97942: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.97948: variable 'omit' from source: magic vars 46400 1727204554.97957: variable 'omit' from source: magic vars 46400 1727204554.97985: variable 'item' from source: unknown 46400 1727204554.98029: variable 'item' from source: unknown 46400 1727204554.98041: variable 'omit' from source: magic vars 46400 1727204554.98057: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.98065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.98068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.98078: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.98080: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.98083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.98127: Set connection var ansible_shell_type to sh 46400 1727204554.98133: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.98138: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.98143: Set connection var ansible_connection to ssh 46400 1727204554.98148: Set connection var ansible_pipelining to False 46400 1727204554.98152: Set connection var ansible_timeout to 10 46400 1727204554.98173: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.98177: variable 'ansible_connection' from source: unknown 46400 1727204554.98179: variable 'ansible_module_compression' from source: unknown 46400 1727204554.98181: variable 'ansible_shell_type' from source: unknown 46400 1727204554.98183: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.98186: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.98188: variable 'ansible_pipelining' from source: unknown 46400 1727204554.98190: variable 'ansible_timeout' from source: unknown 46400 1727204554.98192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.98252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.98258: variable 'omit' from source: magic vars 46400 1727204554.98263: starting attempt loop 46400 1727204554.98271: running the handler 46400 1727204554.98282: variable 'lsr_test' from source: include params 46400 1727204554.98330: variable 'lsr_test' from source: include params 46400 1727204554.98341: handler run complete 46400 1727204554.98351: attempt loop complete, returning result 46400 1727204554.98366: variable 'item' from source: unknown 46400 1727204554.98440: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 46400 1727204554.99096: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.99100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.99102: variable 'omit' from source: magic vars 46400 1727204554.99205: variable 'ansible_distribution_major_version' from source: facts 46400 1727204554.99208: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204554.99210: variable 'omit' from source: magic vars 46400 1727204554.99212: variable 'omit' from source: magic vars 46400 1727204554.99255: variable 'item' from source: unknown 46400 
1727204554.99326: variable 'item' from source: unknown 46400 1727204554.99344: variable 'omit' from source: magic vars 46400 1727204554.99369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204554.99382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.99391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204554.99406: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204554.99413: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.99424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.99497: Set connection var ansible_shell_type to sh 46400 1727204554.99510: Set connection var ansible_shell_executable to /bin/sh 46400 1727204554.99519: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204554.99531: Set connection var ansible_connection to ssh 46400 1727204554.99540: Set connection var ansible_pipelining to False 46400 1727204554.99548: Set connection var ansible_timeout to 10 46400 1727204554.99574: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.99581: variable 'ansible_connection' from source: unknown 46400 1727204554.99587: variable 'ansible_module_compression' from source: unknown 46400 1727204554.99593: variable 'ansible_shell_type' from source: unknown 46400 1727204554.99599: variable 'ansible_shell_executable' from source: unknown 46400 1727204554.99606: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204554.99612: variable 'ansible_pipelining' from source: unknown 46400 1727204554.99618: variable 'ansible_timeout' from source: unknown 46400 1727204554.99625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204554.99719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204554.99730: variable 'omit' from source: magic vars 46400 1727204554.99739: starting attempt loop 46400 1727204554.99747: running the handler 46400 1727204554.99772: variable 'lsr_assert' from source: include params 46400 1727204554.99836: variable 'lsr_assert' from source: include params 46400 1727204554.99865: handler run complete 46400 1727204554.99884: attempt loop complete, returning result 46400 1727204554.99901: variable 'item' from source: unknown 46400 1727204554.99965: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 46400 1727204555.00127: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.00139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.00151: variable 'omit' from source: magic vars 46400 1727204555.00678: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.00694: Evaluated conditional (ansible_distribution_major_version != '6'): 
True 46400 1727204555.00702: variable 'omit' from source: magic vars 46400 1727204555.00723: variable 'omit' from source: magic vars 46400 1727204555.00769: variable 'item' from source: unknown 46400 1727204555.00837: variable 'item' from source: unknown 46400 1727204555.00857: variable 'omit' from source: magic vars 46400 1727204555.00880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.00892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.00902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.00916: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.00923: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.00929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.01005: Set connection var ansible_shell_type to sh 46400 1727204555.01017: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.01026: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.01034: Set connection var ansible_connection to ssh 46400 1727204555.01043: Set connection var ansible_pipelining to False 46400 1727204555.01054: Set connection var ansible_timeout to 10 46400 1727204555.01080: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.01087: variable 'ansible_connection' from source: unknown 46400 1727204555.01093: variable 'ansible_module_compression' from source: unknown 46400 1727204555.01099: variable 'ansible_shell_type' from source: unknown 46400 1727204555.01104: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.01110: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.01116: variable 'ansible_pipelining' from source: unknown 46400 1727204555.01122: variable 'ansible_timeout' from source: unknown 46400 1727204555.01128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.01220: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204555.01231: variable 'omit' from source: magic vars 46400 1727204555.01240: starting attempt loop 46400 1727204555.01246: running the handler 46400 1727204555.01354: handler run complete 46400 1727204555.01376: attempt loop complete, returning result 46400 1727204555.01394: variable 'item' from source: unknown 46400 1727204555.01454: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 46400 1727204555.01607: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.01619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.01631: variable 'omit' from source: magic vars 46400 1727204555.01786: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.01797: Evaluated 
conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.01809: variable 'omit' from source: magic vars 46400 1727204555.01828: variable 'omit' from source: magic vars 46400 1727204555.01874: variable 'item' from source: unknown 46400 1727204555.01944: variable 'item' from source: unknown 46400 1727204555.01963: variable 'omit' from source: magic vars 46400 1727204555.01987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.01998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.02007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.02025: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.02032: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.02039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.02111: Set connection var ansible_shell_type to sh 46400 1727204555.02123: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.02136: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.02145: Set connection var ansible_connection to ssh 46400 1727204555.02153: Set connection var ansible_pipelining to False 46400 1727204555.02161: Set connection var ansible_timeout to 10 46400 1727204555.02186: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.02193: variable 'ansible_connection' from source: unknown 46400 1727204555.02200: variable 'ansible_module_compression' from source: unknown 46400 1727204555.02205: variable 'ansible_shell_type' from source: unknown 46400 1727204555.02211: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.02217: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.02223: variable 'ansible_pipelining' from source: unknown 46400 1727204555.02229: variable 'ansible_timeout' from source: unknown 46400 1727204555.02240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.02333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204555.02350: variable 'omit' from source: magic vars 46400 1727204555.02359: starting attempt loop 46400 1727204555.02367: running the handler 46400 1727204555.02390: variable 'lsr_fail_debug' from source: play vars 46400 1727204555.02454: variable 'lsr_fail_debug' from source: play vars 46400 1727204555.02479: handler run complete 46400 1727204555.02497: attempt loop complete, returning result 46400 1727204555.02523: variable 'item' from source: unknown 46400 1727204555.02580: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 1727204555.02710: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.02722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 
1727204555.02733: variable 'omit' from source: magic vars 46400 1727204555.02899: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.02910: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.02918: variable 'omit' from source: magic vars 46400 1727204555.02936: variable 'omit' from source: magic vars 46400 1727204555.02982: variable 'item' from source: unknown 46400 1727204555.03051: variable 'item' from source: unknown 46400 1727204555.03073: variable 'omit' from source: magic vars 46400 1727204555.03096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.03110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.03129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.03144: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.03152: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.03159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.03235: Set connection var ansible_shell_type to sh 46400 1727204555.03249: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.03259: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.03270: Set connection var ansible_connection to ssh 46400 1727204555.03280: Set connection var ansible_pipelining to False 46400 1727204555.03289: Set connection var ansible_timeout to 10 46400 1727204555.03314: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.03321: variable 'ansible_connection' from source: unknown 46400 1727204555.03328: variable 'ansible_module_compression' from source: unknown 46400 1727204555.03338: variable 'ansible_shell_type' from source: unknown 46400 1727204555.03345: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.03352: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.03359: variable 'ansible_pipelining' from source: unknown 46400 1727204555.03367: variable 'ansible_timeout' from source: unknown 46400 1727204555.03376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.03470: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204555.03483: variable 'omit' from source: magic vars 46400 1727204555.03492: starting attempt loop 46400 1727204555.03498: running the handler 46400 1727204555.03520: variable 'lsr_cleanup' from source: include params 46400 1727204555.03593: variable 'lsr_cleanup' from source: include params 46400 1727204555.03615: handler run complete 46400 1727204555.03633: attempt loop complete, returning result 46400 1727204555.03654: variable 'item' from source: unknown 46400 1727204555.03723: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 46400 1727204555.03832: 
dumping result to json 46400 1727204555.03846: done dumping result, returning 46400 1727204555.03857: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-000000001006] 46400 1727204555.03871: sending task result for task 0affcd87-79f5-1303-fda8-000000001006 46400 1727204555.04008: no more pending results, returning what we have 46400 1727204555.04012: results queue empty 46400 1727204555.04013: checking for any_errors_fatal 46400 1727204555.04020: done checking for any_errors_fatal 46400 1727204555.04021: checking for max_fail_percentage 46400 1727204555.04022: done checking for max_fail_percentage 46400 1727204555.04024: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.04024: done checking to see if all hosts have failed 46400 1727204555.04025: getting the remaining hosts for this loop 46400 1727204555.04027: done getting the remaining hosts for this loop 46400 1727204555.04031: getting the next task for host managed-node2 46400 1727204555.04040: done getting next task for host managed-node2 46400 1727204555.04043: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204555.04046: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204555.04050: getting variables 46400 1727204555.04052: in VariableManager get_vars() 46400 1727204555.04093: Calling all_inventory to load vars for managed-node2 46400 1727204555.04096: Calling groups_inventory to load vars for managed-node2 46400 1727204555.04100: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.04113: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.04116: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.04119: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.05418: done sending task result for task 0affcd87-79f5-1303-fda8-000000001006 46400 1727204555.05422: WORKER PROCESS EXITING 46400 1727204555.06030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.07647: done with get_vars() 46400 1727204555.07681: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.130) 0:00:45.362 ***** 46400 1727204555.07784: entering _queue_task() for managed-node2/include_tasks 46400 1727204555.08132: worker is 1 (out of 1 available) 46400 1727204555.08145: exiting _queue_task() for managed-node2/include_tasks 46400 1727204555.08159: done queuing things up, now waiting for results queue to drain 46400 1727204555.08161: waiting for pending results... 
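For reference, the per-item results above come from the "Show item" debug loop over the lsr_* test parameters. A minimal sketch of such a task, reconstructed only from the loop items and output visible in this log (the real task file is not reproduced here, so the exact wording is an assumption; the distribution-version conditional evaluated for each item appears to be inherited from an enclosing play or block rather than defined on this task):

    - name: Show item
      debug:
        var: "{{ item }}"
      loop:
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup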
46400 1727204555.08448: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204555.08575: in run() - task 0affcd87-79f5-1303-fda8-000000001007 46400 1727204555.08596: variable 'ansible_search_path' from source: unknown 46400 1727204555.08609: variable 'ansible_search_path' from source: unknown 46400 1727204555.08654: calling self._execute() 46400 1727204555.08759: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.08774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.08790: variable 'omit' from source: magic vars 46400 1727204555.09170: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.09187: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.09199: _execute() done 46400 1727204555.09206: dumping result to json 46400 1727204555.09213: done dumping result, returning 46400 1727204555.09223: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-000000001007] 46400 1727204555.09234: sending task result for task 0affcd87-79f5-1303-fda8-000000001007 46400 1727204555.09338: done sending task result for task 0affcd87-79f5-1303-fda8-000000001007 46400 1727204555.09345: WORKER PROCESS EXITING 46400 1727204555.09389: no more pending results, returning what we have 46400 1727204555.09395: in VariableManager get_vars() 46400 1727204555.09440: Calling all_inventory to load vars for managed-node2 46400 1727204555.09443: Calling groups_inventory to load vars for managed-node2 46400 1727204555.09447: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.09461: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.09465: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.09468: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.11254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.12897: done with get_vars() 46400 1727204555.12919: variable 'ansible_search_path' from source: unknown 46400 1727204555.12920: variable 'ansible_search_path' from source: unknown 46400 1727204555.12957: we have included files to process 46400 1727204555.12959: generating all_blocks data 46400 1727204555.12960: done generating all_blocks data 46400 1727204555.12968: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204555.12969: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204555.12971: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204555.13083: in VariableManager get_vars() 46400 1727204555.13107: done with get_vars() 46400 1727204555.13226: done processing included file 46400 1727204555.13228: iterating over new_blocks loaded from include file 46400 1727204555.13230: in VariableManager get_vars() 46400 1727204555.13247: done with get_vars() 46400 1727204555.13249: filtering new block on tags 46400 1727204555.13290: done filtering new block on tags 46400 1727204555.13293: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204555.13299: extending task lists for all hosts with included blocks 46400 1727204555.13769: done extending task lists 46400 1727204555.13770: done processing included files 46400 1727204555.13771: results queue empty 46400 1727204555.13772: checking for any_errors_fatal 46400 1727204555.13778: done checking for any_errors_fatal 46400 1727204555.13779: checking for max_fail_percentage 46400 1727204555.13781: done checking for max_fail_percentage 46400 1727204555.13781: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.13782: done checking to see if all hosts have failed 46400 1727204555.13783: getting the remaining hosts for this loop 46400 1727204555.13785: done getting the remaining hosts for this loop 46400 1727204555.13787: getting the next task for host managed-node2 46400 1727204555.13792: done getting next task for host managed-node2 46400 1727204555.13794: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204555.13797: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204555.13800: getting variables 46400 1727204555.13801: in VariableManager get_vars() 46400 1727204555.13812: Calling all_inventory to load vars for managed-node2 46400 1727204555.13814: Calling groups_inventory to load vars for managed-node2 46400 1727204555.13817: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.13824: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.13827: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.13829: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.15083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.16713: done with get_vars() 46400 1727204555.16746: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.090) 0:00:45.452 ***** 46400 1727204555.16834: entering _queue_task() for managed-node2/include_tasks 46400 1727204555.17189: worker is 1 (out of 1 available) 46400 1727204555.17201: exiting _queue_task() for managed-node2/include_tasks 46400 1727204555.17215: done queuing things up, now waiting for results queue to drain 46400 1727204555.17217: waiting for pending results... 
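The include chain recorded in the entries above (run_test.yml:21 -> show_interfaces.yml:3 -> get_current_interfaces.yml) corresponds to plain include_tasks entries along these lines. Only the task names and file paths are taken from the log; the file contents and path resolution (relative to the including file) are assumptions:

    # tasks/run_test.yml, around line 21 per the task path above
    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml

    # tasks/show_interfaces.yml, line 3 per the task path above
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml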
46400 1727204555.17508: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204555.17643: in run() - task 0affcd87-79f5-1303-fda8-00000000102e 46400 1727204555.17670: variable 'ansible_search_path' from source: unknown 46400 1727204555.17678: variable 'ansible_search_path' from source: unknown 46400 1727204555.17720: calling self._execute() 46400 1727204555.17820: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.17833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.17848: variable 'omit' from source: magic vars 46400 1727204555.18240: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.18257: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.18270: _execute() done 46400 1727204555.18278: dumping result to json 46400 1727204555.18285: done dumping result, returning 46400 1727204555.18295: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-00000000102e] 46400 1727204555.18306: sending task result for task 0affcd87-79f5-1303-fda8-00000000102e 46400 1727204555.18421: done sending task result for task 0affcd87-79f5-1303-fda8-00000000102e 46400 1727204555.18452: no more pending results, returning what we have 46400 1727204555.18458: in VariableManager get_vars() 46400 1727204555.18505: Calling all_inventory to load vars for managed-node2 46400 1727204555.18509: Calling groups_inventory to load vars for managed-node2 46400 1727204555.18513: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.18529: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.18532: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.18536: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.19657: WORKER PROCESS EXITING 46400 1727204555.20332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.22016: done with get_vars() 46400 1727204555.22047: variable 'ansible_search_path' from source: unknown 46400 1727204555.22049: variable 'ansible_search_path' from source: unknown 46400 1727204555.22092: we have included files to process 46400 1727204555.22093: generating all_blocks data 46400 1727204555.22095: done generating all_blocks data 46400 1727204555.22097: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204555.22098: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204555.22100: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204555.22369: done processing included file 46400 1727204555.22372: iterating over new_blocks loaded from include file 46400 1727204555.22374: in VariableManager get_vars() 46400 1727204555.22391: done with get_vars() 46400 1727204555.22393: filtering new block on tags 46400 1727204555.22431: done filtering new block on tags 46400 1727204555.22433: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 46400 1727204555.22439: extending task lists for all hosts with included blocks 46400 1727204555.22577: done extending task lists 46400 1727204555.22578: done processing included files 46400 1727204555.22579: results queue empty 46400 1727204555.22580: checking for any_errors_fatal 46400 1727204555.22584: done checking for any_errors_fatal 46400 1727204555.22584: checking for max_fail_percentage 46400 1727204555.22585: done checking for max_fail_percentage 46400 1727204555.22586: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.22587: done checking to see if all hosts have failed 46400 1727204555.22587: getting the remaining hosts for this loop 46400 1727204555.22588: done getting the remaining hosts for this loop 46400 1727204555.22591: getting the next task for host managed-node2 46400 1727204555.22595: done getting next task for host managed-node2 46400 1727204555.22597: ^ task is: TASK: Gather current interface info 46400 1727204555.22600: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204555.22602: getting variables 46400 1727204555.22602: in VariableManager get_vars() 46400 1727204555.22612: Calling all_inventory to load vars for managed-node2 46400 1727204555.22614: Calling groups_inventory to load vars for managed-node2 46400 1727204555.22617: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.22622: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.22624: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.22626: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.23882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.25602: done with get_vars() 46400 1727204555.25627: done getting variables 46400 1727204555.25675: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.088) 0:00:45.541 ***** 46400 1727204555.25708: entering _queue_task() for managed-node2/command 46400 1727204555.26050: worker is 1 (out of 1 available) 46400 1727204555.26062: exiting _queue_task() for managed-node2/command 46400 1727204555.26078: done queuing things up, now waiting for results queue to drain 46400 1727204555.26080: waiting for pending results... 
46400 1727204555.26365: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204555.26503: in run() - task 0affcd87-79f5-1303-fda8-000000001069 46400 1727204555.26529: variable 'ansible_search_path' from source: unknown 46400 1727204555.26537: variable 'ansible_search_path' from source: unknown 46400 1727204555.26581: calling self._execute() 46400 1727204555.26685: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.26696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.26711: variable 'omit' from source: magic vars 46400 1727204555.27102: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.27120: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.27131: variable 'omit' from source: magic vars 46400 1727204555.27198: variable 'omit' from source: magic vars 46400 1727204555.27239: variable 'omit' from source: magic vars 46400 1727204555.27295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204555.27336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.27366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204555.27394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.27411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.27443: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.27453: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.27460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.27571: Set connection var ansible_shell_type to sh 46400 1727204555.27586: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.27596: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.27611: Set connection var ansible_connection to ssh 46400 1727204555.27621: Set connection var ansible_pipelining to False 46400 1727204555.27631: Set connection var ansible_timeout to 10 46400 1727204555.27660: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.27671: variable 'ansible_connection' from source: unknown 46400 1727204555.27679: variable 'ansible_module_compression' from source: unknown 46400 1727204555.27686: variable 'ansible_shell_type' from source: unknown 46400 1727204555.27692: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.27699: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.27706: variable 'ansible_pipelining' from source: unknown 46400 1727204555.27717: variable 'ansible_timeout' from source: unknown 46400 1727204555.27725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.27880: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204555.27898: variable 'omit' from source: magic vars 46400 
1727204555.27908: starting attempt loop 46400 1727204555.27914: running the handler 46400 1727204555.27938: _low_level_execute_command(): starting 46400 1727204555.27951: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204555.28720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204555.28739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.28754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.28779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.28821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.28836: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204555.28851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.28873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204555.28885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204555.28895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204555.28907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.28920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.28934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.28949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.28960: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204555.28978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.29056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.29077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204555.29092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.29176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.30831: stdout chunk (state=3): >>>/root <<< 46400 1727204555.30987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204555.31047: stderr chunk (state=3): >>><<< 46400 1727204555.31050: stdout chunk (state=3): >>><<< 46400 1727204555.31172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204555.31176: _low_level_execute_command(): starting 46400 1727204555.31179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051 `" && echo ansible-tmp-1727204555.3107548-49580-136139068143051="` echo /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051 `" ) && sleep 0' 46400 1727204555.31835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204555.31857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.31878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.31898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.31944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.31959: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204555.31982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.32001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204555.32015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204555.32027: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204555.32040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.32055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.32077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.32093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.32105: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204555.32119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.32205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.32230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204555.32248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.32329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.34198: stdout chunk (state=3): >>>ansible-tmp-1727204555.3107548-49580-136139068143051=/root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051 <<< 46400 1727204555.34304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204555.34410: stderr chunk (state=3): >>><<< 46400 1727204555.34421: stdout chunk (state=3): >>><<< 46400 1727204555.34577: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204555.3107548-49580-136139068143051=/root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204555.34581: variable 'ansible_module_compression' from source: unknown 46400 1727204555.34584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204555.34685: variable 'ansible_facts' from source: unknown 46400 1727204555.34693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/AnsiballZ_command.py 46400 1727204555.34859: Sending initial data 46400 1727204555.34862: Sent initial data (156 bytes) 46400 1727204555.36276: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.36281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.36305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.36309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.36312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.36389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.36392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204555.36404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.36484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.38208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204555.38245: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204555.38280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp99f825yy /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/AnsiballZ_command.py <<< 46400 1727204555.38313: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204555.39388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204555.39568: stderr chunk (state=3): >>><<< 46400 1727204555.39572: stdout chunk (state=3): >>><<< 46400 1727204555.39597: done transferring module to remote 46400 1727204555.39607: _low_level_execute_command(): starting 46400 1727204555.39612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/ /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/AnsiballZ_command.py && sleep 0' 46400 1727204555.40303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204555.40313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.40323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.40340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.40385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.40393: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204555.40404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.40417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204555.40425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204555.40432: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204555.40440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.40450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.40462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.40477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.40484: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204555.40494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.40561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.40586: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 46400 1727204555.40598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.40675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.42393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204555.42876: stderr chunk (state=3): >>><<< 46400 1727204555.42880: stdout chunk (state=3): >>><<< 46400 1727204555.42904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204555.42908: _low_level_execute_command(): starting 46400 1727204555.42910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/AnsiballZ_command.py && sleep 0' 46400 1727204555.43611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204555.43619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.43629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.43643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.43692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.43700: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204555.43708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.43722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204555.43729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204555.43736: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204555.43745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.43755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.43776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.43783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.43790: stderr chunk (state=3): >>>debug2: 
match found <<< 46400 1727204555.43799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.43874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.43897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204555.43909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.43989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.57579: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:35.571441", "end": "2024-09-24 15:02:35.574738", "delta": "0:00:00.003297", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204555.58796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204555.58888: stderr chunk (state=3): >>><<< 46400 1727204555.58892: stdout chunk (state=3): >>><<< 46400 1727204555.58936: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:02:35.571441", "end": "2024-09-24 15:02:35.574738", "delta": "0:00:00.003297", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
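The module invocation echoed in the result above shows the exact arguments used: chdir "/sys/class/net" and raw params "ls -1". A sketch of the corresponding task follows; only those arguments and the task name/path are taken from the log, and the register name is a placeholder that does not appear here:

    # tasks/get_current_interfaces.yml:3, per the task path above
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces  # placeholder name; the actual registered variable is not visible in this log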
46400 1727204555.58980: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204555.58989: _low_level_execute_command(): starting 46400 1727204555.58994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204555.3107548-49580-136139068143051/ > /dev/null 2>&1 && sleep 0' 46400 1727204555.59758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204555.59773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.59784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.59799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.59842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.59858: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204555.59878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.59892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204555.59900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204555.59907: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204555.59915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204555.59924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204555.59936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204555.59944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204555.59950: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204555.59974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204555.60044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204555.60068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204555.60091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204555.60161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204555.61983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204555.62129: stderr chunk (state=3): >>><<< 46400 1727204555.62133: stdout chunk (state=3): >>><<< 46400 1727204555.62158: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204555.62171: handler run complete 46400 1727204555.62199: Evaluated conditional (False): False 46400 1727204555.62210: attempt loop complete, returning result 46400 1727204555.62213: _execute() done 46400 1727204555.62216: dumping result to json 46400 1727204555.62221: done dumping result, returning 46400 1727204555.62230: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-000000001069] 46400 1727204555.62236: sending task result for task 0affcd87-79f5-1303-fda8-000000001069 46400 1727204555.62404: done sending task result for task 0affcd87-79f5-1303-fda8-000000001069 46400 1727204555.62407: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003297", "end": "2024-09-24 15:02:35.574738", "rc": 0, "start": "2024-09-24 15:02:35.571441" } STDOUT: bonding_masters eth0 lo 46400 1727204555.62501: no more pending results, returning what we have 46400 1727204555.62506: results queue empty 46400 1727204555.62508: checking for any_errors_fatal 46400 1727204555.62510: done checking for any_errors_fatal 46400 1727204555.62511: checking for max_fail_percentage 46400 1727204555.62513: done checking for max_fail_percentage 46400 1727204555.62514: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.62515: done checking to see if all hosts have failed 46400 1727204555.62516: getting the remaining hosts for this loop 46400 1727204555.62518: done getting the remaining hosts for this loop 46400 1727204555.62522: getting the next task for host managed-node2 46400 1727204555.62531: done getting next task for host managed-node2 46400 1727204555.62534: ^ task is: TASK: Set current_interfaces 46400 1727204555.62539: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204555.62545: getting variables 46400 1727204555.62547: in VariableManager get_vars() 46400 1727204555.62590: Calling all_inventory to load vars for managed-node2 46400 1727204555.62594: Calling groups_inventory to load vars for managed-node2 46400 1727204555.62598: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.62610: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.62613: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.62623: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.64332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.66014: done with get_vars() 46400 1727204555.66046: done getting variables 46400 1727204555.66111: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.404) 0:00:45.946 ***** 46400 1727204555.66146: entering _queue_task() for managed-node2/set_fact 46400 1727204555.66493: worker is 1 (out of 1 available) 46400 1727204555.66507: exiting _queue_task() for managed-node2/set_fact 46400 1727204555.66522: done queuing things up, now waiting for results queue to drain 46400 1727204555.66524: waiting for pending results... 
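The "Set current_interfaces" task announced above (get_current_interfaces.yml:9) converts the registered command output into a fact. A sketch of what such a set_fact typically looks like; the stdout_lines expression is an assumption, since only the resulting fact value appears in this trace:

    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"

The ok result that follows confirms the fact ends up as ['bonding_masters', 'eth0', 'lo'].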
46400 1727204555.66807: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204555.66947: in run() - task 0affcd87-79f5-1303-fda8-00000000106a 46400 1727204555.66978: variable 'ansible_search_path' from source: unknown 46400 1727204555.66987: variable 'ansible_search_path' from source: unknown 46400 1727204555.67033: calling self._execute() 46400 1727204555.67139: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.67152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.67169: variable 'omit' from source: magic vars 46400 1727204555.67561: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.67581: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.67594: variable 'omit' from source: magic vars 46400 1727204555.67666: variable 'omit' from source: magic vars 46400 1727204555.67780: variable '_current_interfaces' from source: set_fact 46400 1727204555.67853: variable 'omit' from source: magic vars 46400 1727204555.67905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204555.67953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.67983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204555.68007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.68023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.68065: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.68075: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.68084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.68191: Set connection var ansible_shell_type to sh 46400 1727204555.68206: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.68218: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.68228: Set connection var ansible_connection to ssh 46400 1727204555.68236: Set connection var ansible_pipelining to False 46400 1727204555.68246: Set connection var ansible_timeout to 10 46400 1727204555.68281: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.68290: variable 'ansible_connection' from source: unknown 46400 1727204555.68298: variable 'ansible_module_compression' from source: unknown 46400 1727204555.68304: variable 'ansible_shell_type' from source: unknown 46400 1727204555.68311: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.68317: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.68325: variable 'ansible_pipelining' from source: unknown 46400 1727204555.68332: variable 'ansible_timeout' from source: unknown 46400 1727204555.68339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.68494: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204555.68512: variable 'omit' from source: magic vars 46400 1727204555.68522: starting attempt loop 46400 1727204555.68529: running the handler 46400 1727204555.68546: handler run complete 46400 1727204555.68560: attempt loop complete, returning result 46400 1727204555.68569: _execute() done 46400 1727204555.68577: dumping result to json 46400 1727204555.68584: done dumping result, returning 46400 1727204555.68599: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-00000000106a] 46400 1727204555.68609: sending task result for task 0affcd87-79f5-1303-fda8-00000000106a ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204555.68768: no more pending results, returning what we have 46400 1727204555.68772: results queue empty 46400 1727204555.68773: checking for any_errors_fatal 46400 1727204555.68784: done checking for any_errors_fatal 46400 1727204555.68785: checking for max_fail_percentage 46400 1727204555.68787: done checking for max_fail_percentage 46400 1727204555.68788: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.68789: done checking to see if all hosts have failed 46400 1727204555.68789: getting the remaining hosts for this loop 46400 1727204555.68791: done getting the remaining hosts for this loop 46400 1727204555.68796: getting the next task for host managed-node2 46400 1727204555.68808: done getting next task for host managed-node2 46400 1727204555.68811: ^ task is: TASK: Show current_interfaces 46400 1727204555.68816: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204555.68822: getting variables 46400 1727204555.68823: in VariableManager get_vars() 46400 1727204555.68866: Calling all_inventory to load vars for managed-node2 46400 1727204555.68869: Calling groups_inventory to load vars for managed-node2 46400 1727204555.68873: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.68885: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.68888: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.68891: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.69996: done sending task result for task 0affcd87-79f5-1303-fda8-00000000106a 46400 1727204555.70000: WORKER PROCESS EXITING 46400 1727204555.75900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.77525: done with get_vars() 46400 1727204555.77558: done getting variables 46400 1727204555.77609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.114) 0:00:46.060 ***** 46400 1727204555.77638: entering _queue_task() for managed-node2/debug 46400 1727204555.77982: worker is 1 (out of 1 available) 46400 1727204555.77995: exiting _queue_task() for managed-node2/debug 46400 1727204555.78008: done queuing things up, now waiting for results queue to drain 46400 1727204555.78010: waiting for pending results... 
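The "Show current_interfaces" task queued above (show_interfaces.yml:5) is a plain debug task; its MSG output appears a little further down. A sketch consistent with that output, with the exact msg expression being an assumption:

    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"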
46400 1727204555.78300: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204555.78438: in run() - task 0affcd87-79f5-1303-fda8-00000000102f 46400 1727204555.78466: variable 'ansible_search_path' from source: unknown 46400 1727204555.78475: variable 'ansible_search_path' from source: unknown 46400 1727204555.78515: calling self._execute() 46400 1727204555.78618: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.78631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.78644: variable 'omit' from source: magic vars 46400 1727204555.79133: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.79154: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.79167: variable 'omit' from source: magic vars 46400 1727204555.79224: variable 'omit' from source: magic vars 46400 1727204555.79335: variable 'current_interfaces' from source: set_fact 46400 1727204555.79373: variable 'omit' from source: magic vars 46400 1727204555.79424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204555.79508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204555.79605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204555.79627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.79643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204555.79679: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204555.79728: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.79736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.79841: Set connection var ansible_shell_type to sh 46400 1727204555.79856: Set connection var ansible_shell_executable to /bin/sh 46400 1727204555.79870: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204555.79880: Set connection var ansible_connection to ssh 46400 1727204555.79889: Set connection var ansible_pipelining to False 46400 1727204555.79898: Set connection var ansible_timeout to 10 46400 1727204555.79930: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.79937: variable 'ansible_connection' from source: unknown 46400 1727204555.79943: variable 'ansible_module_compression' from source: unknown 46400 1727204555.79948: variable 'ansible_shell_type' from source: unknown 46400 1727204555.79954: variable 'ansible_shell_executable' from source: unknown 46400 1727204555.79959: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.79971: variable 'ansible_pipelining' from source: unknown 46400 1727204555.79978: variable 'ansible_timeout' from source: unknown 46400 1727204555.79986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.80141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
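The run of "Set connection var ..." entries above shows how the connection to managed-node2 is parameterized before the action plugin executes; most of these come from defaults rather than explicit host vars (the trace marks them "from source: unknown"). Expressed as inventory variables they would look roughly like the following, with the values copied from the log and their placement (host vars versus defaults) being an assumption:

    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10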
46400 1727204555.80157: variable 'omit' from source: magic vars 46400 1727204555.80168: starting attempt loop 46400 1727204555.80175: running the handler 46400 1727204555.80229: handler run complete 46400 1727204555.80251: attempt loop complete, returning result 46400 1727204555.80259: _execute() done 46400 1727204555.80269: dumping result to json 46400 1727204555.80277: done dumping result, returning 46400 1727204555.80289: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-00000000102f] 46400 1727204555.80300: sending task result for task 0affcd87-79f5-1303-fda8-00000000102f ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204555.80448: no more pending results, returning what we have 46400 1727204555.80454: results queue empty 46400 1727204555.80455: checking for any_errors_fatal 46400 1727204555.80467: done checking for any_errors_fatal 46400 1727204555.80468: checking for max_fail_percentage 46400 1727204555.80470: done checking for max_fail_percentage 46400 1727204555.80471: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.80472: done checking to see if all hosts have failed 46400 1727204555.80473: getting the remaining hosts for this loop 46400 1727204555.80475: done getting the remaining hosts for this loop 46400 1727204555.80479: getting the next task for host managed-node2 46400 1727204555.80491: done getting next task for host managed-node2 46400 1727204555.80494: ^ task is: TASK: Setup 46400 1727204555.80497: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204555.80503: getting variables 46400 1727204555.80505: in VariableManager get_vars() 46400 1727204555.80545: Calling all_inventory to load vars for managed-node2 46400 1727204555.80548: Calling groups_inventory to load vars for managed-node2 46400 1727204555.80552: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.80565: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.80568: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.80571: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.82190: done sending task result for task 0affcd87-79f5-1303-fda8-00000000102f 46400 1727204555.82195: WORKER PROCESS EXITING 46400 1727204555.82676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.84525: done with get_vars() 46400 1727204555.84561: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.071) 0:00:46.132 ***** 46400 1727204555.84748: entering _queue_task() for managed-node2/include_tasks 46400 1727204555.85098: worker is 1 (out of 1 available) 46400 1727204555.85112: exiting _queue_task() for managed-node2/include_tasks 46400 1727204555.85127: done queuing things up, now waiting for results queue to drain 46400 1727204555.85128: waiting for pending results... 46400 1727204555.85426: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204555.85556: in run() - task 0affcd87-79f5-1303-fda8-000000001008 46400 1727204555.85581: variable 'ansible_search_path' from source: unknown 46400 1727204555.85590: variable 'ansible_search_path' from source: unknown 46400 1727204555.85644: variable 'lsr_setup' from source: include params 46400 1727204555.85875: variable 'lsr_setup' from source: include params 46400 1727204555.85958: variable 'omit' from source: magic vars 46400 1727204555.86111: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.86126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.86139: variable 'omit' from source: magic vars 46400 1727204555.86378: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.86391: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.86400: variable 'item' from source: unknown 46400 1727204555.86473: variable 'item' from source: unknown 46400 1727204555.86514: variable 'item' from source: unknown 46400 1727204555.86583: variable 'item' from source: unknown 46400 1727204555.86770: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.86783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.86796: variable 'omit' from source: magic vars 46400 1727204555.86944: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.86956: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.86967: variable 'item' from source: unknown 46400 1727204555.87031: variable 'item' from source: unknown 46400 1727204555.87075: variable 'item' from source: unknown 46400 1727204555.87136: variable 'item' from source: unknown 46400 1727204555.87228: dumping result to json 46400 1727204555.87239: 
done dumping result, returning 46400 1727204555.87250: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-000000001008] 46400 1727204555.87267: sending task result for task 0affcd87-79f5-1303-fda8-000000001008 46400 1727204555.87340: done sending task result for task 0affcd87-79f5-1303-fda8-000000001008 46400 1727204555.87348: WORKER PROCESS EXITING 46400 1727204555.87389: no more pending results, returning what we have 46400 1727204555.87395: in VariableManager get_vars() 46400 1727204555.87441: Calling all_inventory to load vars for managed-node2 46400 1727204555.87445: Calling groups_inventory to load vars for managed-node2 46400 1727204555.87449: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.87466: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.87470: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.87474: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.89353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.90606: done with get_vars() 46400 1727204555.90622: variable 'ansible_search_path' from source: unknown 46400 1727204555.90623: variable 'ansible_search_path' from source: unknown 46400 1727204555.90652: variable 'ansible_search_path' from source: unknown 46400 1727204555.90653: variable 'ansible_search_path' from source: unknown 46400 1727204555.90674: we have included files to process 46400 1727204555.90675: generating all_blocks data 46400 1727204555.90676: done generating all_blocks data 46400 1727204555.90679: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204555.90679: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204555.90681: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204555.90850: done processing included file 46400 1727204555.90851: iterating over new_blocks loaded from include file 46400 1727204555.90852: in VariableManager get_vars() 46400 1727204555.90866: done with get_vars() 46400 1727204555.90867: filtering new block on tags 46400 1727204555.90891: done filtering new block on tags 46400 1727204555.90893: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 46400 1727204555.90896: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204555.90897: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204555.90899: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204555.90959: done processing included file 46400 1727204555.90961: iterating over new_blocks loaded from include file 46400 1727204555.90962: in VariableManager get_vars() 46400 1727204555.90974: done with get_vars() 46400 1727204555.90975: filtering new block on tags 46400 
1727204555.90989: done filtering new block on tags 46400 1727204555.90991: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 46400 1727204555.90993: extending task lists for all hosts with included blocks 46400 1727204555.91351: done extending task lists 46400 1727204555.91352: done processing included files 46400 1727204555.91353: results queue empty 46400 1727204555.91353: checking for any_errors_fatal 46400 1727204555.91356: done checking for any_errors_fatal 46400 1727204555.91356: checking for max_fail_percentage 46400 1727204555.91357: done checking for max_fail_percentage 46400 1727204555.91358: checking to see if all hosts have failed and the running result is not ok 46400 1727204555.91358: done checking to see if all hosts have failed 46400 1727204555.91359: getting the remaining hosts for this loop 46400 1727204555.91359: done getting the remaining hosts for this loop 46400 1727204555.91362: getting the next task for host managed-node2 46400 1727204555.91366: done getting next task for host managed-node2 46400 1727204555.91367: ^ task is: TASK: Include network role 46400 1727204555.91369: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204555.91371: getting variables 46400 1727204555.91371: in VariableManager get_vars() 46400 1727204555.91378: Calling all_inventory to load vars for managed-node2 46400 1727204555.91384: Calling groups_inventory to load vars for managed-node2 46400 1727204555.91385: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.91389: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.91391: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.91393: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.92261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204555.93656: done with get_vars() 46400 1727204555.93675: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:02:35 -0400 (0:00:00.089) 0:00:46.221 ***** 46400 1727204555.93736: entering _queue_task() for managed-node2/include_role 46400 1727204555.93984: worker is 1 (out of 1 available) 46400 1727204555.93996: exiting _queue_task() for managed-node2/include_role 46400 1727204555.94010: done queuing things up, now waiting for results queue to drain 46400 1727204555.94012: waiting for pending results... 46400 1727204555.94200: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204555.94292: in run() - task 0affcd87-79f5-1303-fda8-00000000108f 46400 1727204555.94303: variable 'ansible_search_path' from source: unknown 46400 1727204555.94306: variable 'ansible_search_path' from source: unknown 46400 1727204555.94335: calling self._execute() 46400 1727204555.94412: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204555.94416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204555.94425: variable 'omit' from source: magic vars 46400 1727204555.94715: variable 'ansible_distribution_major_version' from source: facts 46400 1727204555.94724: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204555.94730: _execute() done 46400 1727204555.94733: dumping result to json 46400 1727204555.94736: done dumping result, returning 46400 1727204555.94742: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-00000000108f] 46400 1727204555.94747: sending task result for task 0affcd87-79f5-1303-fda8-00000000108f 46400 1727204555.94857: done sending task result for task 0affcd87-79f5-1303-fda8-00000000108f 46400 1727204555.94862: WORKER PROCESS EXITING 46400 1727204555.94896: no more pending results, returning what we have 46400 1727204555.94901: in VariableManager get_vars() 46400 1727204555.94942: Calling all_inventory to load vars for managed-node2 46400 1727204555.94944: Calling groups_inventory to load vars for managed-node2 46400 1727204555.94948: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204555.94965: Calling all_plugins_play to load vars for managed-node2 46400 1727204555.94968: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204555.94972: Calling groups_plugins_play to load vars for managed-node2 46400 1727204555.96317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 
1727204555.97733: done with get_vars() 46400 1727204555.97757: variable 'ansible_search_path' from source: unknown 46400 1727204555.97759: variable 'ansible_search_path' from source: unknown 46400 1727204555.97889: variable 'omit' from source: magic vars 46400 1727204555.97919: variable 'omit' from source: magic vars 46400 1727204555.97929: variable 'omit' from source: magic vars 46400 1727204555.97933: we have included files to process 46400 1727204555.97933: generating all_blocks data 46400 1727204555.97934: done generating all_blocks data 46400 1727204555.97935: processing included file: fedora.linux_system_roles.network 46400 1727204555.97949: in VariableManager get_vars() 46400 1727204555.97960: done with get_vars() 46400 1727204555.97984: in VariableManager get_vars() 46400 1727204555.97996: done with get_vars() 46400 1727204555.98023: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204555.98103: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204555.98150: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204555.98431: in VariableManager get_vars() 46400 1727204555.98446: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204556.00127: iterating over new_blocks loaded from include file 46400 1727204556.00129: in VariableManager get_vars() 46400 1727204556.00147: done with get_vars() 46400 1727204556.00149: filtering new block on tags 46400 1727204556.00382: done filtering new block on tags 46400 1727204556.00385: in VariableManager get_vars() 46400 1727204556.00400: done with get_vars() 46400 1727204556.00401: filtering new block on tags 46400 1727204556.00418: done filtering new block on tags 46400 1727204556.00420: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204556.00425: extending task lists for all hosts with included blocks 46400 1727204556.00578: done extending task lists 46400 1727204556.00580: done processing included files 46400 1727204556.00581: results queue empty 46400 1727204556.00582: checking for any_errors_fatal 46400 1727204556.00586: done checking for any_errors_fatal 46400 1727204556.00587: checking for max_fail_percentage 46400 1727204556.00588: done checking for max_fail_percentage 46400 1727204556.00588: checking to see if all hosts have failed and the running result is not ok 46400 1727204556.00589: done checking to see if all hosts have failed 46400 1727204556.00590: getting the remaining hosts for this loop 46400 1727204556.00592: done getting the remaining hosts for this loop 46400 1727204556.00594: getting the next task for host managed-node2 46400 1727204556.00599: done getting next task for host managed-node2 46400 1727204556.00601: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204556.00604: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204556.00615: getting variables 46400 1727204556.00616: in VariableManager get_vars() 46400 1727204556.00628: Calling all_inventory to load vars for managed-node2 46400 1727204556.00630: Calling groups_inventory to load vars for managed-node2 46400 1727204556.00633: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.00638: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.00640: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.00642: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.01845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.02766: done with get_vars() 46400 1727204556.02783: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:36 -0400 (0:00:00.091) 0:00:46.312 ***** 46400 1727204556.02839: entering _queue_task() for managed-node2/include_tasks 46400 1727204556.03090: worker is 1 (out of 1 available) 46400 1727204556.03105: exiting _queue_task() for managed-node2/include_tasks 46400 1727204556.03118: done queuing things up, now waiting for results queue to drain 46400 1727204556.03120: waiting for pending results... 
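The "Include network role" task traced above (create_bridge_profile.yml:3) pulls in fedora.linux_system_roles.network, whose defaults/main.yml, meta/main.yml and tasks/main.yml are then loaded; note the redirect of ansible.builtin.yum to ansible.builtin.dnf during that load. The include itself is typically no more than the following sketch; any role variables passed alongside it are not visible in this excerpt:

    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network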
46400 1727204556.03315: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204556.03406: in run() - task 0affcd87-79f5-1303-fda8-0000000010f5 46400 1727204556.03423: variable 'ansible_search_path' from source: unknown 46400 1727204556.03427: variable 'ansible_search_path' from source: unknown 46400 1727204556.03457: calling self._execute() 46400 1727204556.03539: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.03544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.03704: variable 'omit' from source: magic vars 46400 1727204556.03934: variable 'ansible_distribution_major_version' from source: facts 46400 1727204556.03947: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204556.03959: _execute() done 46400 1727204556.03965: dumping result to json 46400 1727204556.03970: done dumping result, returning 46400 1727204556.03973: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-0000000010f5] 46400 1727204556.03978: sending task result for task 0affcd87-79f5-1303-fda8-0000000010f5 46400 1727204556.04080: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010f5 46400 1727204556.04084: WORKER PROCESS EXITING 46400 1727204556.04127: no more pending results, returning what we have 46400 1727204556.04131: in VariableManager get_vars() 46400 1727204556.04184: Calling all_inventory to load vars for managed-node2 46400 1727204556.04188: Calling groups_inventory to load vars for managed-node2 46400 1727204556.04190: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.04206: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.04210: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.04213: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.05505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.06549: done with get_vars() 46400 1727204556.06566: variable 'ansible_search_path' from source: unknown 46400 1727204556.06567: variable 'ansible_search_path' from source: unknown 46400 1727204556.06598: we have included files to process 46400 1727204556.06599: generating all_blocks data 46400 1727204556.06600: done generating all_blocks data 46400 1727204556.06602: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204556.06603: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204556.06604: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204556.07010: done processing included file 46400 1727204556.07012: iterating over new_blocks loaded from include file 46400 1727204556.07013: in VariableManager get_vars() 46400 1727204556.07032: done with get_vars() 46400 1727204556.07033: filtering new block on tags 46400 1727204556.07053: done filtering new block on tags 46400 1727204556.07054: in VariableManager get_vars() 46400 1727204556.07071: done with get_vars() 46400 1727204556.07072: filtering new block on tags 46400 1727204556.07101: done filtering new block on tags 46400 1727204556.07103: in 
VariableManager get_vars() 46400 1727204556.07116: done with get_vars() 46400 1727204556.07117: filtering new block on tags 46400 1727204556.07143: done filtering new block on tags 46400 1727204556.07145: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204556.07148: extending task lists for all hosts with included blocks 46400 1727204556.08176: done extending task lists 46400 1727204556.08177: done processing included files 46400 1727204556.08178: results queue empty 46400 1727204556.08178: checking for any_errors_fatal 46400 1727204556.08180: done checking for any_errors_fatal 46400 1727204556.08181: checking for max_fail_percentage 46400 1727204556.08182: done checking for max_fail_percentage 46400 1727204556.08182: checking to see if all hosts have failed and the running result is not ok 46400 1727204556.08183: done checking to see if all hosts have failed 46400 1727204556.08183: getting the remaining hosts for this loop 46400 1727204556.08184: done getting the remaining hosts for this loop 46400 1727204556.08186: getting the next task for host managed-node2 46400 1727204556.08189: done getting next task for host managed-node2 46400 1727204556.08192: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204556.08195: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204556.08203: getting variables 46400 1727204556.08204: in VariableManager get_vars() 46400 1727204556.08213: Calling all_inventory to load vars for managed-node2 46400 1727204556.08215: Calling groups_inventory to load vars for managed-node2 46400 1727204556.08217: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.08222: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.08223: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.08225: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.08898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.09817: done with get_vars() 46400 1727204556.09836: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:36 -0400 (0:00:00.070) 0:00:46.383 ***** 46400 1727204556.09900: entering _queue_task() for managed-node2/setup 46400 1727204556.10157: worker is 1 (out of 1 available) 46400 1727204556.10174: exiting _queue_task() for managed-node2/setup 46400 1727204556.10188: done queuing things up, now waiting for results queue to drain 46400 1727204556.10189: waiting for pending results... 46400 1727204556.10387: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204556.10491: in run() - task 0affcd87-79f5-1303-fda8-000000001152 46400 1727204556.10505: variable 'ansible_search_path' from source: unknown 46400 1727204556.10509: variable 'ansible_search_path' from source: unknown 46400 1727204556.10536: calling self._execute() 46400 1727204556.10612: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.10622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.10631: variable 'omit' from source: magic vars 46400 1727204556.10915: variable 'ansible_distribution_major_version' from source: facts 46400 1727204556.10925: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204556.11085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204556.12846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204556.12894: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204556.12922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204556.12948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204556.12970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204556.13030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204556.13051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204556.13074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204556.13100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204556.13111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204556.13149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204556.13170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204556.13187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204556.13212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204556.13224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204556.13335: variable '__network_required_facts' from source: role '' defaults 46400 1727204556.13341: variable 'ansible_facts' from source: unknown 46400 1727204556.13796: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204556.13800: when evaluation is False, skipping this task 46400 1727204556.13803: _execute() done 46400 1727204556.13805: dumping result to json 46400 1727204556.13808: done dumping result, returning 46400 1727204556.13814: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000001152] 46400 1727204556.13821: sending task result for task 0affcd87-79f5-1303-fda8-000000001152 46400 1727204556.13910: done sending task result for task 0affcd87-79f5-1303-fda8-000000001152 46400 1727204556.13912: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204556.13959: no more pending results, returning what we have 46400 1727204556.13963: results queue empty 46400 1727204556.13965: checking for any_errors_fatal 46400 1727204556.13967: done checking for any_errors_fatal 46400 1727204556.13968: checking for max_fail_percentage 46400 1727204556.13970: done checking for max_fail_percentage 46400 1727204556.13971: checking to see if all hosts have failed and the running result is not ok 46400 1727204556.13972: done checking to see if all hosts have failed 46400 1727204556.13972: getting the remaining hosts for this loop 46400 1727204556.13974: done getting the remaining hosts for 
this loop 46400 1727204556.13978: getting the next task for host managed-node2 46400 1727204556.13989: done getting next task for host managed-node2 46400 1727204556.13994: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204556.14000: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204556.14025: getting variables 46400 1727204556.14027: in VariableManager get_vars() 46400 1727204556.14073: Calling all_inventory to load vars for managed-node2 46400 1727204556.14076: Calling groups_inventory to load vars for managed-node2 46400 1727204556.14078: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.14088: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.14090: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.14098: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.15026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.15952: done with get_vars() 46400 1727204556.15970: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:36 -0400 (0:00:00.061) 0:00:46.445 ***** 46400 1727204556.16045: entering _queue_task() for managed-node2/stat 46400 1727204556.16276: worker is 1 (out of 1 available) 46400 1727204556.16290: exiting _queue_task() for managed-node2/stat 46400 1727204556.16303: done queuing things up, now waiting for results queue to drain 46400 1727204556.16305: waiting for pending results... 
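The skipped result above shows the role's guarded fact gathering: "Ensure ansible_facts used by role are present" only runs when __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, and here that evaluates to False because all required facts are already cached; the result is censored because the task is run with no_log. The "Check if system is ostree" task queued just above follows the same pattern, as the trace below shows (it is skipped because __network_is_ostree is already defined). A sketch of the pattern with the conditionals copied from the log; the setup arguments, the stat path, and the register name are assumptions, since neither task reaches module execution here:

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined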
46400 1727204556.16496: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204556.16601: in run() - task 0affcd87-79f5-1303-fda8-000000001154 46400 1727204556.16612: variable 'ansible_search_path' from source: unknown 46400 1727204556.16616: variable 'ansible_search_path' from source: unknown 46400 1727204556.16645: calling self._execute() 46400 1727204556.16712: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.16716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.16725: variable 'omit' from source: magic vars 46400 1727204556.16995: variable 'ansible_distribution_major_version' from source: facts 46400 1727204556.17004: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204556.17119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204556.17317: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204556.17350: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204556.17378: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204556.17404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204556.17468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204556.17486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204556.17507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204556.17525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204556.17592: variable '__network_is_ostree' from source: set_fact 46400 1727204556.17597: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204556.17600: when evaluation is False, skipping this task 46400 1727204556.17603: _execute() done 46400 1727204556.17605: dumping result to json 46400 1727204556.17609: done dumping result, returning 46400 1727204556.17615: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000001154] 46400 1727204556.17623: sending task result for task 0affcd87-79f5-1303-fda8-000000001154 46400 1727204556.17714: done sending task result for task 0affcd87-79f5-1303-fda8-000000001154 46400 1727204556.17717: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204556.17781: no more pending results, returning what we have 46400 1727204556.17785: results queue empty 46400 1727204556.17786: checking for any_errors_fatal 46400 1727204556.17794: done checking for any_errors_fatal 46400 1727204556.17794: checking for 
max_fail_percentage 46400 1727204556.17796: done checking for max_fail_percentage 46400 1727204556.17797: checking to see if all hosts have failed and the running result is not ok 46400 1727204556.17798: done checking to see if all hosts have failed 46400 1727204556.17799: getting the remaining hosts for this loop 46400 1727204556.17801: done getting the remaining hosts for this loop 46400 1727204556.17804: getting the next task for host managed-node2 46400 1727204556.17813: done getting next task for host managed-node2 46400 1727204556.17817: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204556.17822: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204556.17846: getting variables 46400 1727204556.17848: in VariableManager get_vars() 46400 1727204556.17883: Calling all_inventory to load vars for managed-node2 46400 1727204556.17886: Calling groups_inventory to load vars for managed-node2 46400 1727204556.17888: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.17897: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.17899: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.17902: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.18703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.19656: done with get_vars() 46400 1727204556.19676: done getting variables 46400 1727204556.19721: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:36 -0400 (0:00:00.037) 0:00:46.482 ***** 46400 1727204556.19749: entering _queue_task() for managed-node2/set_fact 46400 1727204556.20000: worker is 1 (out of 1 available) 46400 1727204556.20013: exiting _queue_task() for managed-node2/set_fact 46400 1727204556.20026: done queuing things up, now waiting for results queue to drain 46400 1727204556.20028: waiting for pending results... 46400 1727204556.20327: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204556.20497: in run() - task 0affcd87-79f5-1303-fda8-000000001155 46400 1727204556.20517: variable 'ansible_search_path' from source: unknown 46400 1727204556.20523: variable 'ansible_search_path' from source: unknown 46400 1727204556.20566: calling self._execute() 46400 1727204556.20667: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.20678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.20700: variable 'omit' from source: magic vars 46400 1727204556.21076: variable 'ansible_distribution_major_version' from source: facts 46400 1727204556.21093: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204556.21278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204556.21525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204556.21567: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204556.21592: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204556.21617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204556.21683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204556.21700: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204556.21719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204556.21738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204556.21805: variable '__network_is_ostree' from source: set_fact 46400 1727204556.21811: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204556.21814: when evaluation is False, skipping this task 46400 1727204556.21817: _execute() done 46400 1727204556.21820: dumping result to json 46400 1727204556.21822: done dumping result, returning 46400 1727204556.21830: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000001155] 46400 1727204556.21836: sending task result for task 0affcd87-79f5-1303-fda8-000000001155 46400 1727204556.21924: done sending task result for task 0affcd87-79f5-1303-fda8-000000001155 46400 1727204556.21927: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204556.21982: no more pending results, returning what we have 46400 1727204556.21987: results queue empty 46400 1727204556.21988: checking for any_errors_fatal 46400 1727204556.21995: done checking for any_errors_fatal 46400 1727204556.21995: checking for max_fail_percentage 46400 1727204556.21997: done checking for max_fail_percentage 46400 1727204556.21998: checking to see if all hosts have failed and the running result is not ok 46400 1727204556.21999: done checking to see if all hosts have failed 46400 1727204556.22000: getting the remaining hosts for this loop 46400 1727204556.22001: done getting the remaining hosts for this loop 46400 1727204556.22006: getting the next task for host managed-node2 46400 1727204556.22018: done getting next task for host managed-node2 46400 1727204556.22022: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204556.22027: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204556.22049: getting variables 46400 1727204556.22051: in VariableManager get_vars() 46400 1727204556.22089: Calling all_inventory to load vars for managed-node2 46400 1727204556.22092: Calling groups_inventory to load vars for managed-node2 46400 1727204556.22094: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204556.22104: Calling all_plugins_play to load vars for managed-node2 46400 1727204556.22106: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204556.22109: Calling groups_plugins_play to load vars for managed-node2 46400 1727204556.23020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204556.24541: done with get_vars() 46400 1727204556.24572: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:36 -0400 (0:00:00.049) 0:00:46.531 ***** 46400 1727204556.24678: entering _queue_task() for managed-node2/service_facts 46400 1727204556.25011: worker is 1 (out of 1 available) 46400 1727204556.25023: exiting _queue_task() for managed-node2/service_facts 46400 1727204556.25035: done queuing things up, now waiting for results queue to drain 46400 1727204556.25037: waiting for pending results... 
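Everything from here to the end of this section is the remote execution of that "Check which services are running" task: probing the remote home directory over SSH, creating the ansible-tmp working directory, transferring the AnsiballZ_service_facts.py payload via SFTP, marking it executable, running it with /usr/bin/python3.9, and collecting the large ansible_facts.services result. As a rough sketch, not the role's actual file, the task at set_facts.yml:21 and one typical way its output is consumed look like this; the module takes no arguments, matching the "module_args": {} shown in the invocation further below, and the follow-up fact name is an invented example.

    # Sketch only: gather per-service state into ansible_facts.services.
    - name: Check which services are running
      ansible.builtin.service_facts:

    # Invented example of consuming the result; 'nm_is_running' is not taken from this log.
    - name: Record whether NetworkManager is currently running
      ansible.builtin.set_fact:
        nm_is_running: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"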
46400 1727204556.25337: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204556.25507: in run() - task 0affcd87-79f5-1303-fda8-000000001157 46400 1727204556.25528: variable 'ansible_search_path' from source: unknown 46400 1727204556.25535: variable 'ansible_search_path' from source: unknown 46400 1727204556.25580: calling self._execute() 46400 1727204556.25682: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.25698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.25711: variable 'omit' from source: magic vars 46400 1727204556.26089: variable 'ansible_distribution_major_version' from source: facts 46400 1727204556.26105: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204556.26115: variable 'omit' from source: magic vars 46400 1727204556.26205: variable 'omit' from source: magic vars 46400 1727204556.26246: variable 'omit' from source: magic vars 46400 1727204556.26297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204556.26336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204556.26371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204556.26393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204556.26408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204556.26439: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204556.26448: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.26462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.26563: Set connection var ansible_shell_type to sh 46400 1727204556.26584: Set connection var ansible_shell_executable to /bin/sh 46400 1727204556.26594: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204556.26603: Set connection var ansible_connection to ssh 46400 1727204556.26612: Set connection var ansible_pipelining to False 46400 1727204556.26620: Set connection var ansible_timeout to 10 46400 1727204556.26647: variable 'ansible_shell_executable' from source: unknown 46400 1727204556.26654: variable 'ansible_connection' from source: unknown 46400 1727204556.26663: variable 'ansible_module_compression' from source: unknown 46400 1727204556.26672: variable 'ansible_shell_type' from source: unknown 46400 1727204556.26682: variable 'ansible_shell_executable' from source: unknown 46400 1727204556.26689: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204556.26696: variable 'ansible_pipelining' from source: unknown 46400 1727204556.26702: variable 'ansible_timeout' from source: unknown 46400 1727204556.26709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204556.26917: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204556.26932: variable 'omit' from source: magic vars 46400 
1727204556.26941: starting attempt loop 46400 1727204556.26947: running the handler 46400 1727204556.26972: _low_level_execute_command(): starting 46400 1727204556.26984: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204556.27759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204556.27780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.27794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.27810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.27850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.27866: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204556.27883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.27904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204556.27917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204556.27930: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204556.27943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.27958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.27982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.28001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.28013: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204556.28028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.28112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204556.28138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204556.28155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204556.28242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204556.29897: stdout chunk (state=3): >>>/root <<< 46400 1727204556.30006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204556.30106: stderr chunk (state=3): >>><<< 46400 1727204556.30117: stdout chunk (state=3): >>><<< 46400 1727204556.30176: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204556.30180: _low_level_execute_command(): starting 46400 1727204556.30259: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511 `" && echo ansible-tmp-1727204556.301482-49617-212301195733511="` echo /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511 `" ) && sleep 0' 46400 1727204556.30955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204556.30974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.30997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.31024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.31094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.31113: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204556.31128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.31148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204556.31173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204556.31186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204556.31199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.31222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.31252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.31276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.31294: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204556.31309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.31399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204556.31427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204556.31449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204556.31555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204556.33398: stdout chunk (state=3): >>>ansible-tmp-1727204556.301482-49617-212301195733511=/root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511 <<< 46400 1727204556.33612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204556.33616: stdout chunk (state=3): >>><<< 46400 1727204556.33618: stderr chunk (state=3): >>><<< 46400 1727204556.33973: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204556.301482-49617-212301195733511=/root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204556.33977: variable 'ansible_module_compression' from source: unknown 46400 1727204556.33980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204556.33982: variable 'ansible_facts' from source: unknown 46400 1727204556.33984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/AnsiballZ_service_facts.py 46400 1727204556.34051: Sending initial data 46400 1727204556.34054: Sent initial data (161 bytes) 46400 1727204556.35081: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204556.35102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.35118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.35137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.35190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.35197: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204556.35211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.35224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204556.35231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204556.35240: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204556.35248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.35257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.35275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.35282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.35289: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204556.35298: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.35377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204556.35418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204556.35433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204556.35503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204556.37222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204556.37259: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204556.37303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp3vi16jv4 /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/AnsiballZ_service_facts.py <<< 46400 1727204556.37337: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204556.38672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204556.38684: stderr chunk (state=3): >>><<< 46400 1727204556.38688: stdout chunk (state=3): >>><<< 46400 1727204556.38711: done transferring module to remote 46400 1727204556.38721: _low_level_execute_command(): starting 46400 1727204556.38725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/ /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/AnsiballZ_service_facts.py && sleep 0' 46400 1727204556.39411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204556.39419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.39430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.39443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.39491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.39498: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204556.39507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.39520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204556.39528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204556.39534: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204556.39543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.39550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.39571: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.39579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.39587: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204556.39597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.39679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204556.39693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204556.39698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204556.39771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204556.41570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204556.41617: stderr chunk (state=3): >>><<< 46400 1727204556.41621: stdout chunk (state=3): >>><<< 46400 1727204556.41670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204556.41674: _low_level_execute_command(): starting 46400 1727204556.41677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/AnsiballZ_service_facts.py && sleep 0' 46400 1727204556.42370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204556.42386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.42401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.42420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.42469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.42483: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204556.42500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.42519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204556.42532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204556.42549: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204556.42563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204556.42580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204556.42599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204556.42612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204556.42623: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204556.42636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204556.42715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204556.42733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204556.42748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204556.42842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204557.71570: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 46400 1727204557.71597: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204557.71603: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 46400 1727204557.71629: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 46400 1727204557.71637: stdout chunk (state=3): >>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204557.72984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204557.73035: stderr chunk (state=3): >>><<< 46400 1727204557.73038: stdout chunk (state=3): >>><<< 46400 1727204557.73176: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204557.73806: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204557.73825: _low_level_execute_command(): starting 46400 1727204557.73834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204556.301482-49617-212301195733511/ > /dev/null 2>&1 && sleep 0' 46400 1727204557.74520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204557.74534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.74547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.74569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.74617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.74628: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204557.74642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.74658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204557.74677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204557.74690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204557.74701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.74713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.74727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.74738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.74748: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204557.74763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.74846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204557.74873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204557.74889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204557.74956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204557.76852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204557.76856: stdout chunk (state=3): >>><<< 46400 1727204557.76859: stderr chunk (state=3): >>><<< 46400 1727204557.77073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204557.77076: handler run complete 46400 1727204557.77188: variable 'ansible_facts' from source: unknown 46400 1727204557.77363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204557.77837: variable 'ansible_facts' from source: unknown 46400 1727204557.77978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204557.78191: attempt loop complete, returning result 46400 1727204557.78201: _execute() done 46400 1727204557.78208: dumping result to json 46400 1727204557.78277: done dumping result, returning 46400 1727204557.78290: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000001157] 46400 1727204557.78300: sending task result for task 0affcd87-79f5-1303-fda8-000000001157 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204557.79323: no more pending results, returning what we have 46400 1727204557.79327: results queue empty 46400 1727204557.79328: checking for any_errors_fatal 46400 1727204557.79334: done checking for any_errors_fatal 46400 1727204557.79335: checking for max_fail_percentage 46400 1727204557.79336: done checking for max_fail_percentage 46400 1727204557.79337: checking to see if all hosts have failed and the running result is not ok 46400 1727204557.79338: done checking to see if all hosts have failed 46400 1727204557.79339: getting the remaining hosts for this loop 46400 1727204557.79341: done getting the remaining hosts for this loop 46400 1727204557.79345: getting the next task for host managed-node2 46400 1727204557.79354: done getting next task for host managed-node2 46400 1727204557.79358: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204557.79369: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204557.79382: getting variables 46400 1727204557.79384: in VariableManager get_vars() 46400 1727204557.79424: Calling all_inventory to load vars for managed-node2 46400 1727204557.79427: Calling groups_inventory to load vars for managed-node2 46400 1727204557.79430: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204557.79441: Calling all_plugins_play to load vars for managed-node2 46400 1727204557.79444: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204557.79447: Calling groups_plugins_play to load vars for managed-node2 46400 1727204557.80450: done sending task result for task 0affcd87-79f5-1303-fda8-000000001157 46400 1727204557.80454: WORKER PROCESS EXITING 46400 1727204557.81347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204557.83119: done with get_vars() 46400 1727204557.83149: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:37 -0400 (0:00:01.585) 0:00:48.117 ***** 46400 1727204557.83269: entering _queue_task() for managed-node2/package_facts 46400 1727204557.83630: worker is 1 (out of 1 available) 46400 1727204557.83646: exiting _queue_task() for managed-node2/package_facts 46400 1727204557.83663: done queuing things up, now waiting for results queue to drain 46400 1727204557.83667: waiting for pending results... 
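The package_facts task that follows goes through the same low-level command pipeline the log traces below: probe the remote home directory, create a per-task temporary directory under ~/.ansible/tmp, upload the AnsiballZ payload over SFTP, mark it executable, run it with the remote Python, and later remove the directory again. The sketch below only reconstructs those command strings locally for illustration; the temporary-directory naming is an assumption inferred from the names visible in this log, not taken from Ansible's source.

```python
# Illustrative sketch only: reconstruct, locally, simplified versions of the
# /bin/sh -c command strings the executor issues over SSH in the log below.
# The real run interleaves an SFTP upload of the AnsiballZ payload between the
# mkdir and chmod steps and runs everything over a multiplexed SSH connection.
import random
import time

def remote_tmpdir() -> str:
    # Mirrors the ansible-tmp-<timestamp>-<nnn>-<nnn> pattern seen in the log;
    # the two trailing numbers are treated here as a pid-like value and a
    # random integer (an assumption, not taken from Ansible's source).
    return f"ansible-tmp-{time.time()}-{random.randint(1, 99999)}-{random.randint(1, 2**48)}"

module = "AnsiballZ_package_facts.py"          # payload name from the log
tmpdir = f"/root/.ansible/tmp/{remote_tmpdir()}"

steps = [
    "echo ~ && sleep 0",
    f'( umask 77 && mkdir -p "{tmpdir}" && echo {tmpdir} ) && sleep 0',
    f"chmod u+x {tmpdir}/ {tmpdir}/{module} && sleep 0",
    f"/usr/bin/python3.9 {tmpdir}/{module} && sleep 0",
    f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0",
]

for cmd in steps:
    print("/bin/sh -c", repr(cmd))
```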
46400 1727204557.83993: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204557.84187: in run() - task 0affcd87-79f5-1303-fda8-000000001158 46400 1727204557.84208: variable 'ansible_search_path' from source: unknown 46400 1727204557.84222: variable 'ansible_search_path' from source: unknown 46400 1727204557.84271: calling self._execute() 46400 1727204557.84377: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204557.84389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204557.84403: variable 'omit' from source: magic vars 46400 1727204557.84799: variable 'ansible_distribution_major_version' from source: facts 46400 1727204557.84821: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204557.84833: variable 'omit' from source: magic vars 46400 1727204557.84928: variable 'omit' from source: magic vars 46400 1727204557.84970: variable 'omit' from source: magic vars 46400 1727204557.85025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204557.85073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204557.85105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204557.85128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204557.85150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204557.85190: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204557.85203: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204557.85211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204557.85327: Set connection var ansible_shell_type to sh 46400 1727204557.85342: Set connection var ansible_shell_executable to /bin/sh 46400 1727204557.85353: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204557.85371: Set connection var ansible_connection to ssh 46400 1727204557.85381: Set connection var ansible_pipelining to False 46400 1727204557.85390: Set connection var ansible_timeout to 10 46400 1727204557.85422: variable 'ansible_shell_executable' from source: unknown 46400 1727204557.85429: variable 'ansible_connection' from source: unknown 46400 1727204557.85436: variable 'ansible_module_compression' from source: unknown 46400 1727204557.85441: variable 'ansible_shell_type' from source: unknown 46400 1727204557.85447: variable 'ansible_shell_executable' from source: unknown 46400 1727204557.85453: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204557.85464: variable 'ansible_pipelining' from source: unknown 46400 1727204557.85476: variable 'ansible_timeout' from source: unknown 46400 1727204557.85484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204557.85706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204557.85721: variable 'omit' from source: magic vars 46400 
1727204557.85730: starting attempt loop 46400 1727204557.85740: running the handler 46400 1727204557.85759: _low_level_execute_command(): starting 46400 1727204557.85776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204557.86574: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204557.86590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.86604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.86623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.86675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.86693: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204557.86708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.86726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204557.86739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204557.86751: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204557.86767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.86782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.86801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.86814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.86825: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204557.86842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.86930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204557.86953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204557.86979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204557.87056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204557.88676: stdout chunk (state=3): >>>/root <<< 46400 1727204557.88788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204557.88885: stderr chunk (state=3): >>><<< 46400 1727204557.88891: stdout chunk (state=3): >>><<< 46400 1727204557.89014: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204557.89018: _low_level_execute_command(): starting 46400 1727204557.89021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313 `" && echo ansible-tmp-1727204557.8891215-49662-76221982762313="` echo /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313 `" ) && sleep 0' 46400 1727204557.90026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.90030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.90069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.90073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.90076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.90144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204557.90188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204557.90283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204557.92147: stdout chunk (state=3): >>>ansible-tmp-1727204557.8891215-49662-76221982762313=/root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313 <<< 46400 1727204557.92280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204557.92347: stderr chunk (state=3): >>><<< 46400 1727204557.92350: stdout chunk (state=3): >>><<< 46400 1727204557.92373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204557.8891215-49662-76221982762313=/root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204557.92424: variable 'ansible_module_compression' from source: unknown 46400 1727204557.92486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204557.92543: variable 'ansible_facts' from source: unknown 46400 1727204557.92746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/AnsiballZ_package_facts.py 46400 1727204557.92912: Sending initial data 46400 1727204557.92916: Sent initial data (161 bytes) 46400 1727204557.93918: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204557.93928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.93939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.93953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.94013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.94016: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204557.94053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.94057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204557.94062: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204557.94067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204557.94069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204557.94071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204557.94083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204557.94090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204557.94097: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204557.94107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204557.94187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204557.94205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204557.94217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204557.94297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204557.96024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204557.96040: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204557.96086: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmppn0nu2ja /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/AnsiballZ_package_facts.py <<< 46400 1727204557.96123: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204557.99139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204557.99232: stderr chunk (state=3): >>><<< 46400 1727204557.99236: stdout chunk (state=3): >>><<< 46400 1727204557.99262: done transferring module to remote 46400 1727204557.99273: _low_level_execute_command(): starting 46400 1727204557.99278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/ /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/AnsiballZ_package_facts.py && sleep 0' 46400 1727204558.00040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204558.00050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204558.00066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204558.00079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.00120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204558.00136: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204558.00146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204558.00163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204558.00170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204558.00178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204558.00186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204558.00196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204558.00207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.00215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204558.00222: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204558.00234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204558.00312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204558.00329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204558.00348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204558.00427: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204558.02182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204558.02234: stderr chunk (state=3): >>><<< 46400 1727204558.02238: stdout chunk (state=3): >>><<< 46400 1727204558.02255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204558.02259: _low_level_execute_command(): starting 46400 1727204558.02267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/AnsiballZ_package_facts.py && sleep 0' 46400 1727204558.03544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204558.04182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204558.04198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204558.04218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.04258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204558.04277: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204558.04291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204558.04308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204558.04322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204558.04333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204558.04344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204558.04362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204558.04385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.04399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204558.04411: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204558.04425: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204558.04504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204558.04526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204558.04544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204558.04624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204558.50900: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", 
"version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "lib<<< 46400 1727204558.50950: stdout chunk (state=3): >>>xml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": 
[{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": 
[{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.<<< 46400 1727204558.50960: stdout chunk (state=3): >>>37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", 
"version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux":<<< 46400 1727204558.50986: stdout chunk (state=3): >>> [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86<<< 46400 1727204558.51012: stdout chunk (state=3): >>>_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": 
"8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", 
"version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "r<<< 46400 1727204558.51031: stdout chunk (state=3): >>>elease": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": 
[{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "sourc<<< 46400 1727204558.51041: stdout chunk (state=3): >>>e": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": 
"8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch":<<< 46400 1727204558.51065: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", 
"release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "p<<< 46400 1727204558.51077: stdout chunk (state=3): >>>erl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap"<<< 46400 1727204558.51110: stdout 
chunk (state=3): >>>: [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 46400 1727204558.51116: stdout chunk (state=3): >>>}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204558.52642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204558.52707: stderr chunk (state=3): >>><<< 46400 1727204558.52711: stdout chunk (state=3): >>><<< 46400 1727204558.52751: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204558.54180: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204558.54199: _low_level_execute_command(): starting 46400 1727204558.54203: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204557.8891215-49662-76221982762313/ > /dev/null 2>&1 && sleep 0' 46400 1727204558.54678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204558.54693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.54707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204558.54720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204558.54734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204558.54776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204558.54788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204558.54854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204558.56650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204558.56710: stderr chunk (state=3): >>><<< 46400 1727204558.56713: stdout chunk (state=3): >>><<< 46400 1727204558.56726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204558.56733: handler run complete 46400 1727204558.57226: variable 'ansible_facts' from source: unknown 46400 1727204558.57523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.58732: variable 'ansible_facts' from source: unknown 46400 1727204558.58995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.59422: attempt loop complete, returning result 46400 1727204558.59432: _execute() done 46400 1727204558.59435: dumping result to json 46400 1727204558.59557: done dumping result, returning 46400 1727204558.59567: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000001158] 46400 1727204558.59573: sending task result for task 0affcd87-79f5-1303-fda8-000000001158 46400 1727204558.60925: done sending task result for task 0affcd87-79f5-1303-fda8-000000001158 46400 1727204558.60929: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204558.61023: no more pending results, returning what we have 46400 1727204558.61026: results queue empty 46400 1727204558.61026: checking for any_errors_fatal 46400 1727204558.61030: done checking for any_errors_fatal 46400 1727204558.61030: checking for max_fail_percentage 46400 1727204558.61031: done checking for max_fail_percentage 46400 1727204558.61032: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.61033: done checking to see if all hosts have failed 46400 1727204558.61033: getting the remaining hosts for this loop 46400 1727204558.61034: done getting the remaining hosts for this loop 46400 1727204558.61037: getting the next task for host managed-node2 46400 1727204558.61042: done getting next task for host managed-node2 46400 1727204558.61046: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204558.61049: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204558.61057: getting variables 46400 1727204558.61057: in VariableManager get_vars() 46400 1727204558.61083: Calling all_inventory to load vars for managed-node2 46400 1727204558.61085: Calling groups_inventory to load vars for managed-node2 46400 1727204558.61086: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.61093: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.61095: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.61096: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.61852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.62775: done with get_vars() 46400 1727204558.62793: done getting variables 46400 1727204558.62840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.796) 0:00:48.913 ***** 46400 1727204558.62878: entering _queue_task() for managed-node2/debug 46400 1727204558.63122: worker is 1 (out of 1 available) 46400 1727204558.63138: exiting _queue_task() for managed-node2/debug 46400 1727204558.63152: done queuing things up, now waiting for results queue to drain 46400 1727204558.63153: waiting for pending results... 
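A debug task of roughly this shape would produce the "Using network provider: nm" message shown in the result below; this is a hedged sketch, not the verbatim source of roles/network/tasks/main.yml:7:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"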
46400 1727204558.63339: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204558.63431: in run() - task 0affcd87-79f5-1303-fda8-0000000010f6 46400 1727204558.63441: variable 'ansible_search_path' from source: unknown 46400 1727204558.63445: variable 'ansible_search_path' from source: unknown 46400 1727204558.63477: calling self._execute() 46400 1727204558.63546: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.63551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.63567: variable 'omit' from source: magic vars 46400 1727204558.63834: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.63845: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.63851: variable 'omit' from source: magic vars 46400 1727204558.63896: variable 'omit' from source: magic vars 46400 1727204558.63966: variable 'network_provider' from source: set_fact 46400 1727204558.63986: variable 'omit' from source: magic vars 46400 1727204558.64021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204558.64053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204558.64077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204558.64090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204558.64099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204558.64122: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204558.64125: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.64129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.64200: Set connection var ansible_shell_type to sh 46400 1727204558.64209: Set connection var ansible_shell_executable to /bin/sh 46400 1727204558.64214: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204558.64219: Set connection var ansible_connection to ssh 46400 1727204558.64224: Set connection var ansible_pipelining to False 46400 1727204558.64229: Set connection var ansible_timeout to 10 46400 1727204558.64248: variable 'ansible_shell_executable' from source: unknown 46400 1727204558.64253: variable 'ansible_connection' from source: unknown 46400 1727204558.64256: variable 'ansible_module_compression' from source: unknown 46400 1727204558.64259: variable 'ansible_shell_type' from source: unknown 46400 1727204558.64268: variable 'ansible_shell_executable' from source: unknown 46400 1727204558.64271: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.64275: variable 'ansible_pipelining' from source: unknown 46400 1727204558.64278: variable 'ansible_timeout' from source: unknown 46400 1727204558.64282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.64387: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204558.64396: variable 'omit' from source: magic vars 46400 1727204558.64401: starting attempt loop 46400 1727204558.64404: running the handler 46400 1727204558.64439: handler run complete 46400 1727204558.64449: attempt loop complete, returning result 46400 1727204558.64453: _execute() done 46400 1727204558.64455: dumping result to json 46400 1727204558.64457: done dumping result, returning 46400 1727204558.64463: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-0000000010f6] 46400 1727204558.64475: sending task result for task 0affcd87-79f5-1303-fda8-0000000010f6 46400 1727204558.64557: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010f6 46400 1727204558.64560: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204558.64635: no more pending results, returning what we have 46400 1727204558.64638: results queue empty 46400 1727204558.64639: checking for any_errors_fatal 46400 1727204558.64650: done checking for any_errors_fatal 46400 1727204558.64651: checking for max_fail_percentage 46400 1727204558.64653: done checking for max_fail_percentage 46400 1727204558.64654: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.64655: done checking to see if all hosts have failed 46400 1727204558.64655: getting the remaining hosts for this loop 46400 1727204558.64657: done getting the remaining hosts for this loop 46400 1727204558.64661: getting the next task for host managed-node2 46400 1727204558.64671: done getting next task for host managed-node2 46400 1727204558.64676: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204558.64681: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204558.64694: getting variables 46400 1727204558.64696: in VariableManager get_vars() 46400 1727204558.64730: Calling all_inventory to load vars for managed-node2 46400 1727204558.64733: Calling groups_inventory to load vars for managed-node2 46400 1727204558.64735: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.64744: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.64746: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.64749: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.65543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.66574: done with get_vars() 46400 1727204558.66590: done getting variables 46400 1727204558.66633: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.037) 0:00:48.951 ***** 46400 1727204558.66668: entering _queue_task() for managed-node2/fail 46400 1727204558.66905: worker is 1 (out of 1 available) 46400 1727204558.66920: exiting _queue_task() for managed-node2/fail 46400 1727204558.66933: done queuing things up, now waiting for results queue to drain 46400 1727204558.66935: waiting for pending results... 
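The fail task queued here is skipped a few lines below because its condition network_state != {} evaluates to False (network_state still carries the role default of {}). A hedged approximation of such a guarded abort; the message wording and the provider check are assumptions, not the role's source:

    - name: Abort when network_state is combined with the initscripts provider
      ansible.builtin.fail:
        msg: Cannot use the network_state variable with the initscripts provider  # assumed wording
      when:
        - network_state != {}                   # condition reported in the trace below
        - network_provider == "initscripts"     # assumed second condition implied by the task name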
46400 1727204558.67118: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204558.67219: in run() - task 0affcd87-79f5-1303-fda8-0000000010f7 46400 1727204558.67228: variable 'ansible_search_path' from source: unknown 46400 1727204558.67232: variable 'ansible_search_path' from source: unknown 46400 1727204558.67260: calling self._execute() 46400 1727204558.67334: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.67337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.67346: variable 'omit' from source: magic vars 46400 1727204558.67626: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.67636: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.67722: variable 'network_state' from source: role '' defaults 46400 1727204558.67730: Evaluated conditional (network_state != {}): False 46400 1727204558.67737: when evaluation is False, skipping this task 46400 1727204558.67740: _execute() done 46400 1727204558.67742: dumping result to json 46400 1727204558.67745: done dumping result, returning 46400 1727204558.67752: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-0000000010f7] 46400 1727204558.67757: sending task result for task 0affcd87-79f5-1303-fda8-0000000010f7 46400 1727204558.67847: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010f7 46400 1727204558.67850: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204558.67898: no more pending results, returning what we have 46400 1727204558.67902: results queue empty 46400 1727204558.67903: checking for any_errors_fatal 46400 1727204558.67913: done checking for any_errors_fatal 46400 1727204558.67913: checking for max_fail_percentage 46400 1727204558.67915: done checking for max_fail_percentage 46400 1727204558.67916: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.67917: done checking to see if all hosts have failed 46400 1727204558.67917: getting the remaining hosts for this loop 46400 1727204558.67919: done getting the remaining hosts for this loop 46400 1727204558.67923: getting the next task for host managed-node2 46400 1727204558.67932: done getting next task for host managed-node2 46400 1727204558.67936: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204558.67940: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204558.67965: getting variables 46400 1727204558.67967: in VariableManager get_vars() 46400 1727204558.68000: Calling all_inventory to load vars for managed-node2 46400 1727204558.68003: Calling groups_inventory to load vars for managed-node2 46400 1727204558.68005: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.68014: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.68017: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.68020: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.68803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.69710: done with get_vars() 46400 1727204558.69726: done getting variables 46400 1727204558.69774: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.031) 0:00:48.982 ***** 46400 1727204558.69801: entering _queue_task() for managed-node2/fail 46400 1727204558.70027: worker is 1 (out of 1 available) 46400 1727204558.70043: exiting _queue_task() for managed-node2/fail 46400 1727204558.70056: done queuing things up, now waiting for results queue to drain 46400 1727204558.70057: waiting for pending results... 
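As with the previous task, the skip below reports false_condition: "network_state != {}". When `when:` is given as a list, ansible-core evaluates the conditions in order and reports the one that failed, so any version check implied by the task name is never reached on this run. A minimal sketch of that pattern; the message and the second condition are assumptions:

    - name: Abort when network_state is used on a host older than EL8
      ansible.builtin.fail:
        msg: Applying network_state requires EL8 or later   # assumed wording
      when:
        - network_state != {}                               # fails first on this run
        - ansible_distribution_major_version | int < 8      # assumed version guard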
46400 1727204558.70238: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204558.70328: in run() - task 0affcd87-79f5-1303-fda8-0000000010f8 46400 1727204558.70339: variable 'ansible_search_path' from source: unknown 46400 1727204558.70343: variable 'ansible_search_path' from source: unknown 46400 1727204558.70373: calling self._execute() 46400 1727204558.70443: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.70447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.70456: variable 'omit' from source: magic vars 46400 1727204558.70723: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.70732: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.70820: variable 'network_state' from source: role '' defaults 46400 1727204558.70828: Evaluated conditional (network_state != {}): False 46400 1727204558.70834: when evaluation is False, skipping this task 46400 1727204558.70837: _execute() done 46400 1727204558.70840: dumping result to json 46400 1727204558.70843: done dumping result, returning 46400 1727204558.70847: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-0000000010f8] 46400 1727204558.70852: sending task result for task 0affcd87-79f5-1303-fda8-0000000010f8 46400 1727204558.70944: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010f8 46400 1727204558.70947: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204558.71003: no more pending results, returning what we have 46400 1727204558.71007: results queue empty 46400 1727204558.71008: checking for any_errors_fatal 46400 1727204558.71017: done checking for any_errors_fatal 46400 1727204558.71018: checking for max_fail_percentage 46400 1727204558.71019: done checking for max_fail_percentage 46400 1727204558.71020: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.71021: done checking to see if all hosts have failed 46400 1727204558.71022: getting the remaining hosts for this loop 46400 1727204558.71023: done getting the remaining hosts for this loop 46400 1727204558.71027: getting the next task for host managed-node2 46400 1727204558.71035: done getting next task for host managed-node2 46400 1727204558.71039: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204558.71044: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204558.71063: getting variables 46400 1727204558.71066: in VariableManager get_vars() 46400 1727204558.71098: Calling all_inventory to load vars for managed-node2 46400 1727204558.71101: Calling groups_inventory to load vars for managed-node2 46400 1727204558.71103: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.71113: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.71115: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.71118: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.72013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.76515: done with get_vars() 46400 1727204558.76536: done getting variables 46400 1727204558.76575: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.067) 0:00:49.050 ***** 46400 1727204558.76597: entering _queue_task() for managed-node2/fail 46400 1727204558.76835: worker is 1 (out of 1 available) 46400 1727204558.76851: exiting _queue_task() for managed-node2/fail 46400 1727204558.76865: done queuing things up, now waiting for results queue to drain 46400 1727204558.76868: waiting for pending results... 
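In the trace that follows, ansible_distribution_major_version | int > 9 evaluates to False on this EL9 host, so the teaming abort is skipped. A hedged sketch of a fail task with that guard; the real role task presumably also checks whether any team connections are requested, and the message wording is assumed:

    - name: Abort applying teaming configuration on EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later   # assumed wording
      when: ansible_distribution_major_version | int > 9          # condition seen in the trace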
46400 1727204558.77051: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204558.77161: in run() - task 0affcd87-79f5-1303-fda8-0000000010f9 46400 1727204558.77176: variable 'ansible_search_path' from source: unknown 46400 1727204558.77181: variable 'ansible_search_path' from source: unknown 46400 1727204558.77209: calling self._execute() 46400 1727204558.77285: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.77290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.77297: variable 'omit' from source: magic vars 46400 1727204558.77579: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.77587: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.77717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204558.79348: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204558.79405: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204558.79432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204558.79458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204558.79482: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204558.79539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.79558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.79580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.79612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.79623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.79701: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.79713: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204558.79716: when evaluation is False, skipping this task 46400 1727204558.79719: _execute() done 46400 1727204558.79722: dumping result to json 46400 1727204558.79724: done dumping result, returning 46400 1727204558.79731: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-0000000010f9] 46400 1727204558.79736: sending task result for task 
0affcd87-79f5-1303-fda8-0000000010f9 46400 1727204558.79828: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010f9 46400 1727204558.79831: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204558.79880: no more pending results, returning what we have 46400 1727204558.79884: results queue empty 46400 1727204558.79885: checking for any_errors_fatal 46400 1727204558.79893: done checking for any_errors_fatal 46400 1727204558.79894: checking for max_fail_percentage 46400 1727204558.79895: done checking for max_fail_percentage 46400 1727204558.79896: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.79897: done checking to see if all hosts have failed 46400 1727204558.79897: getting the remaining hosts for this loop 46400 1727204558.79899: done getting the remaining hosts for this loop 46400 1727204558.79903: getting the next task for host managed-node2 46400 1727204558.79912: done getting next task for host managed-node2 46400 1727204558.79916: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204558.79921: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204558.79948: getting variables 46400 1727204558.79950: in VariableManager get_vars() 46400 1727204558.79990: Calling all_inventory to load vars for managed-node2 46400 1727204558.79992: Calling groups_inventory to load vars for managed-node2 46400 1727204558.79994: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.80004: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.80006: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.80008: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.80827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.81752: done with get_vars() 46400 1727204558.81775: done getting variables 46400 1727204558.81820: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.052) 0:00:49.103 ***** 46400 1727204558.81846: entering _queue_task() for managed-node2/dnf 46400 1727204558.82092: worker is 1 (out of 1 available) 46400 1727204558.82106: exiting _queue_task() for managed-node2/dnf 46400 1727204558.82121: done queuing things up, now waiting for results queue to drain 46400 1727204558.82123: waiting for pending results... 
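The dnf check queued here is gated, per the trace below, on (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) and on the requested network_connections containing wireless or team interfaces. A rough, hypothetical rendering of such a check; the package name and the connection-type filter are assumptions, not the role's code:

    - name: Check whether network package updates are available (wireless/team case)
      ansible.builtin.dnf:
        name: NetworkManager    # assumed package; the role derives the list from the connection types
        state: latest
      check_mode: true          # only report whether an update would be installed
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - network_connections | selectattr('type', 'defined') | map(attribute='type') | intersect(['wireless', 'team']) | length > 0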
46400 1727204558.82304: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204558.82393: in run() - task 0affcd87-79f5-1303-fda8-0000000010fa 46400 1727204558.82404: variable 'ansible_search_path' from source: unknown 46400 1727204558.82408: variable 'ansible_search_path' from source: unknown 46400 1727204558.82439: calling self._execute() 46400 1727204558.82519: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.82523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.82531: variable 'omit' from source: magic vars 46400 1727204558.82809: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.82820: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.82956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204558.84579: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204558.84896: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204558.84923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204558.84951: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204558.84973: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204558.85029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.85050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.85076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.85103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.85113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.85202: variable 'ansible_distribution' from source: facts 46400 1727204558.85206: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.85217: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204558.85299: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204558.85385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.85402: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.85419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.85444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.85455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.85489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.85506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.85522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.85548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.85557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.85592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.85608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.85625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.85650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.85660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.85765: variable 'network_connections' from source: include params 46400 1727204558.85778: variable 'interface' from source: play vars 46400 1727204558.85827: variable 'interface' from source: play vars 46400 1727204558.85884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204558.86000: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204558.86028: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204558.86051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204558.86077: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204558.86120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204558.86138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204558.86159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.86181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204558.86224: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204558.86377: variable 'network_connections' from source: include params 46400 1727204558.86381: variable 'interface' from source: play vars 46400 1727204558.86423: variable 'interface' from source: play vars 46400 1727204558.86453: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204558.86456: when evaluation is False, skipping this task 46400 1727204558.86459: _execute() done 46400 1727204558.86461: dumping result to json 46400 1727204558.86463: done dumping result, returning 46400 1727204558.86471: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000010fa] 46400 1727204558.86477: sending task result for task 0affcd87-79f5-1303-fda8-0000000010fa 46400 1727204558.86572: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010fa 46400 1727204558.86575: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204558.86630: no more pending results, returning what we have 46400 1727204558.86635: results queue empty 46400 1727204558.86636: checking for any_errors_fatal 46400 1727204558.86641: done checking for any_errors_fatal 46400 1727204558.86641: checking for max_fail_percentage 46400 1727204558.86643: done checking for max_fail_percentage 46400 1727204558.86644: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.86645: done checking to see if all hosts have failed 46400 1727204558.86645: getting the remaining hosts for this loop 46400 1727204558.86647: done getting the remaining hosts for this loop 46400 1727204558.86651: getting the next task for host managed-node2 46400 1727204558.86659: done getting next task for host managed-node2 46400 1727204558.86668: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204558.86673: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204558.86693: getting variables 46400 1727204558.86695: in VariableManager get_vars() 46400 1727204558.86733: Calling all_inventory to load vars for managed-node2 46400 1727204558.86736: Calling groups_inventory to load vars for managed-node2 46400 1727204558.86738: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.86747: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.86750: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.86752: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.87735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.88627: done with get_vars() 46400 1727204558.88644: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204558.88701: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.068) 0:00:49.171 ***** 46400 1727204558.88728: entering _queue_task() for managed-node2/yum 46400 1727204558.88967: worker is 1 (out of 1 available) 46400 1727204558.88984: exiting _queue_task() for managed-node2/yum 46400 1727204558.88996: done queuing things up, now waiting for results queue to drain 46400 1727204558.88998: waiting for pending results... 
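The next task, queued above from roles/network/tasks/main.yml:48, is the YUM counterpart of the previous check; note the 'redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf' line, since on this host the yum action is just an alias handled by dnf. Again, the sketch below is only an illustration, not the role source: the when expression is the false_condition reported for the skip, and the module arguments are assumptions.

    # Hypothetical sketch only -- not the actual role source.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:               # redirected to the dnf action on this host, as the trace notes
        name: "{{ network_packages }}"   # assumed parameterization
        state: latest
      check_mode: true                   # assumed
      when: ansible_distribution_major_version | int < 8

Because the managed node does not report a major version below 8, the guard is False and the task is skipped, which is the expected split between the YUM path (older releases) and the DNF path exercised above.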
46400 1727204558.89189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204558.89282: in run() - task 0affcd87-79f5-1303-fda8-0000000010fb 46400 1727204558.89292: variable 'ansible_search_path' from source: unknown 46400 1727204558.89296: variable 'ansible_search_path' from source: unknown 46400 1727204558.89327: calling self._execute() 46400 1727204558.89400: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.89404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.89412: variable 'omit' from source: magic vars 46400 1727204558.89697: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.89705: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.89835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204558.91745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204558.91827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204558.91873: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204558.91915: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204558.91948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204558.92032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204558.92071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204558.92102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204558.92149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204558.92182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204558.92288: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.92309: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204558.92317: when evaluation is False, skipping this task 46400 1727204558.92324: _execute() done 46400 1727204558.92332: dumping result to json 46400 1727204558.92338: done dumping result, returning 46400 1727204558.92348: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000010fb] 46400 
1727204558.92359: sending task result for task 0affcd87-79f5-1303-fda8-0000000010fb skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204558.92540: no more pending results, returning what we have 46400 1727204558.92544: results queue empty 46400 1727204558.92545: checking for any_errors_fatal 46400 1727204558.92553: done checking for any_errors_fatal 46400 1727204558.92554: checking for max_fail_percentage 46400 1727204558.92555: done checking for max_fail_percentage 46400 1727204558.92556: checking to see if all hosts have failed and the running result is not ok 46400 1727204558.92557: done checking to see if all hosts have failed 46400 1727204558.92558: getting the remaining hosts for this loop 46400 1727204558.92562: done getting the remaining hosts for this loop 46400 1727204558.92570: getting the next task for host managed-node2 46400 1727204558.92580: done getting next task for host managed-node2 46400 1727204558.92584: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204558.92591: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204558.92613: getting variables 46400 1727204558.92615: in VariableManager get_vars() 46400 1727204558.92658: Calling all_inventory to load vars for managed-node2 46400 1727204558.92666: Calling groups_inventory to load vars for managed-node2 46400 1727204558.92670: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204558.92681: Calling all_plugins_play to load vars for managed-node2 46400 1727204558.92684: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204558.92687: Calling groups_plugins_play to load vars for managed-node2 46400 1727204558.93808: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010fb 46400 1727204558.93812: WORKER PROCESS EXITING 46400 1727204558.94478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204558.96249: done with get_vars() 46400 1727204558.96283: done getting variables 46400 1727204558.96348: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:38 -0400 (0:00:00.076) 0:00:49.248 ***** 46400 1727204558.96394: entering _queue_task() for managed-node2/fail 46400 1727204558.96754: worker is 1 (out of 1 available) 46400 1727204558.96776: exiting _queue_task() for managed-node2/fail 46400 1727204558.96790: done queuing things up, now waiting for results queue to drain 46400 1727204558.96792: waiting for pending results... 
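Queued above from roles/network/tasks/main.yml:60, this step uses the fail action, i.e. it exists purely to abort the play unless the operator has accepted a NetworkManager restart when wireless or team connections are involved. The message text below is invented for illustration (the real wording is not shown in this log); only the task name, the fail module, and the guard reported as the false_condition further down are taken from the trace.

    # Hypothetical sketch only -- not the actual role source.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-                          # assumed message text
          Activating wireless or team connections requires restarting
          NetworkManager; re-run after acknowledging that restart.
      when: __network_wireless_connections_defined or __network_team_connections_defined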
46400 1727204558.97097: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204558.97273: in run() - task 0affcd87-79f5-1303-fda8-0000000010fc 46400 1727204558.97292: variable 'ansible_search_path' from source: unknown 46400 1727204558.97300: variable 'ansible_search_path' from source: unknown 46400 1727204558.97348: calling self._execute() 46400 1727204558.97457: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204558.97476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204558.97491: variable 'omit' from source: magic vars 46400 1727204558.97902: variable 'ansible_distribution_major_version' from source: facts 46400 1727204558.97920: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204558.98058: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204558.98278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204559.01108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204559.01193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204559.01251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204559.01301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204559.01334: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204559.01422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.01457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.01499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.01546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.01570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.01626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.01657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.01694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.01743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.01768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.01817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.01847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.01883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.01929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.01950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.02151: variable 'network_connections' from source: include params 46400 1727204559.02173: variable 'interface' from source: play vars 46400 1727204559.02251: variable 'interface' from source: play vars 46400 1727204559.02331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204559.02517: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204559.02559: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204559.02604: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204559.02647: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204559.02702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204559.02730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204559.02758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.02799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204559.02870: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204559.03137: variable 'network_connections' 
from source: include params 46400 1727204559.03148: variable 'interface' from source: play vars 46400 1727204559.03215: variable 'interface' from source: play vars 46400 1727204559.03256: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204559.03271: when evaluation is False, skipping this task 46400 1727204559.03278: _execute() done 46400 1727204559.03285: dumping result to json 46400 1727204559.03293: done dumping result, returning 46400 1727204559.03305: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000010fc] 46400 1727204559.03316: sending task result for task 0affcd87-79f5-1303-fda8-0000000010fc skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204559.03501: no more pending results, returning what we have 46400 1727204559.03505: results queue empty 46400 1727204559.03507: checking for any_errors_fatal 46400 1727204559.03514: done checking for any_errors_fatal 46400 1727204559.03515: checking for max_fail_percentage 46400 1727204559.03517: done checking for max_fail_percentage 46400 1727204559.03518: checking to see if all hosts have failed and the running result is not ok 46400 1727204559.03519: done checking to see if all hosts have failed 46400 1727204559.03519: getting the remaining hosts for this loop 46400 1727204559.03521: done getting the remaining hosts for this loop 46400 1727204559.03526: getting the next task for host managed-node2 46400 1727204559.03536: done getting next task for host managed-node2 46400 1727204559.03540: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204559.03546: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204559.03571: getting variables 46400 1727204559.03573: in VariableManager get_vars() 46400 1727204559.03618: Calling all_inventory to load vars for managed-node2 46400 1727204559.03621: Calling groups_inventory to load vars for managed-node2 46400 1727204559.03624: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204559.03635: Calling all_plugins_play to load vars for managed-node2 46400 1727204559.03639: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204559.03643: Calling groups_plugins_play to load vars for managed-node2 46400 1727204559.04717: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010fc 46400 1727204559.04721: WORKER PROCESS EXITING 46400 1727204559.05597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204559.07280: done with get_vars() 46400 1727204559.07304: done getting variables 46400 1727204559.07370: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:39 -0400 (0:00:00.110) 0:00:49.358 ***** 46400 1727204559.07410: entering _queue_task() for managed-node2/package 46400 1727204559.07748: worker is 1 (out of 1 available) 46400 1727204559.07767: exiting _queue_task() for managed-node2/package 46400 1727204559.07782: done queuing things up, now waiting for results queue to drain 46400 1727204559.07784: waiting for pending results... 
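The 'Install packages' task queued above (roles/network/tasks/main.yml:73) goes through the generic package action, and the long variable walk that follows assembles network_packages from the provider defaults before the guard is evaluated. A minimal sketch of a task with that guard is shown below; the when expression is copied verbatim from the skip result, while the name/state arguments are assumptions.

    # Hypothetical sketch only -- not the actual role source.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # assembled from the role defaults traced below
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

The subset test against ansible_facts.packages (the dictionary typically gathered by the package_facts module earlier in the role) is what lets the role skip the package manager entirely when everything it needs is already installed, which is exactly what happens here.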
46400 1727204559.08086: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204559.08240: in run() - task 0affcd87-79f5-1303-fda8-0000000010fd 46400 1727204559.08263: variable 'ansible_search_path' from source: unknown 46400 1727204559.08276: variable 'ansible_search_path' from source: unknown 46400 1727204559.08316: calling self._execute() 46400 1727204559.08416: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.08428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.08449: variable 'omit' from source: magic vars 46400 1727204559.08844: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.08868: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204559.09086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204559.09376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204559.09429: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204559.09474: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204559.09555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204559.09682: variable 'network_packages' from source: role '' defaults 46400 1727204559.09796: variable '__network_provider_setup' from source: role '' defaults 46400 1727204559.09812: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204559.09885: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204559.09897: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204559.09964: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204559.10158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204559.12389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204559.12459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204559.12508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204559.12546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204559.12586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204559.12682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.12720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.12750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.12804: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.12824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.12880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.12914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.12945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.12995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.13019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.13275: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204559.13400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.13428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.13466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.13511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.13530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.13635: variable 'ansible_python' from source: facts 46400 1727204559.13663: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204559.13754: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204559.13849: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204559.13992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.14023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204559.14054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.14105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.14126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.14181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.14221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.14250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.14298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.14318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.14480: variable 'network_connections' from source: include params 46400 1727204559.14492: variable 'interface' from source: play vars 46400 1727204559.14601: variable 'interface' from source: play vars 46400 1727204559.14685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204559.14716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204559.14750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.14793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204559.14847: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204559.15149: variable 'network_connections' from source: include params 46400 1727204559.15158: variable 'interface' from source: play vars 46400 1727204559.15270: variable 'interface' from source: play vars 46400 1727204559.15330: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204559.15422: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204559.15742: variable 'network_connections' from source: include params 46400 
1727204559.15752: variable 'interface' from source: play vars 46400 1727204559.15820: variable 'interface' from source: play vars 46400 1727204559.15854: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204559.15938: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204559.16272: variable 'network_connections' from source: include params 46400 1727204559.16286: variable 'interface' from source: play vars 46400 1727204559.16349: variable 'interface' from source: play vars 46400 1727204559.16424: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204559.16489: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204559.16506: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204559.16569: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204559.16798: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204559.17327: variable 'network_connections' from source: include params 46400 1727204559.17338: variable 'interface' from source: play vars 46400 1727204559.17407: variable 'interface' from source: play vars 46400 1727204559.17421: variable 'ansible_distribution' from source: facts 46400 1727204559.17429: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.17439: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.17479: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204559.17654: variable 'ansible_distribution' from source: facts 46400 1727204559.17670: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.17681: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.17698: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204559.17874: variable 'ansible_distribution' from source: facts 46400 1727204559.17884: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.17894: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.17937: variable 'network_provider' from source: set_fact 46400 1727204559.17958: variable 'ansible_facts' from source: unknown 46400 1727204559.18912: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204559.18920: when evaluation is False, skipping this task 46400 1727204559.18927: _execute() done 46400 1727204559.18935: dumping result to json 46400 1727204559.18942: done dumping result, returning 46400 1727204559.18953: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-0000000010fd] 46400 1727204559.18969: sending task result for task 0affcd87-79f5-1303-fda8-0000000010fd 46400 1727204559.19093: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010fd skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204559.19145: no more pending results, returning what we have 46400 1727204559.19149: results queue empty 46400 1727204559.19151: checking for any_errors_fatal 46400 1727204559.19159: done checking for any_errors_fatal 46400 1727204559.19163: checking 
for max_fail_percentage 46400 1727204559.19168: done checking for max_fail_percentage 46400 1727204559.19169: checking to see if all hosts have failed and the running result is not ok 46400 1727204559.19170: done checking to see if all hosts have failed 46400 1727204559.19170: getting the remaining hosts for this loop 46400 1727204559.19172: done getting the remaining hosts for this loop 46400 1727204559.19177: getting the next task for host managed-node2 46400 1727204559.19186: done getting next task for host managed-node2 46400 1727204559.19191: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204559.19197: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204559.19218: getting variables 46400 1727204559.19220: in VariableManager get_vars() 46400 1727204559.19269: Calling all_inventory to load vars for managed-node2 46400 1727204559.19276: Calling groups_inventory to load vars for managed-node2 46400 1727204559.19279: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204559.19290: Calling all_plugins_play to load vars for managed-node2 46400 1727204559.19294: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204559.19297: Calling groups_plugins_play to load vars for managed-node2 46400 1727204559.20283: WORKER PROCESS EXITING 46400 1727204559.21073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204559.22963: done with get_vars() 46400 1727204559.22987: done getting variables 46400 1727204559.23049: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:39 -0400 (0:00:00.156) 0:00:49.515 ***** 46400 1727204559.23091: entering _queue_task() for managed-node2/package 46400 1727204559.23425: worker is 1 (out of 1 available) 46400 1727204559.23437: exiting _queue_task() for managed-node2/package 46400 1727204559.23451: done queuing things up, now waiting for results queue to drain 46400 1727204559.23453: waiting for pending results... 
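Next in the queue (roles/network/tasks/main.yml:85) is the task that would pull in NetworkManager and nmstate when the declarative network_state variable is used instead of network_connections. Its guard, network_state != {}, is reported below as the false_condition; everything else in the sketch (the exact package names and state) is inferred from the task name and is not confirmed by this log.

    # Hypothetical sketch only -- not the actual role source.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager               # inferred from the task name
          - nmstate                      # inferred from the task name
        state: present
      when: network_state != {}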
46400 1727204559.23752: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204559.23907: in run() - task 0affcd87-79f5-1303-fda8-0000000010fe 46400 1727204559.23929: variable 'ansible_search_path' from source: unknown 46400 1727204559.23937: variable 'ansible_search_path' from source: unknown 46400 1727204559.23981: calling self._execute() 46400 1727204559.24084: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.24098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.24118: variable 'omit' from source: magic vars 46400 1727204559.24512: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.24531: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204559.24672: variable 'network_state' from source: role '' defaults 46400 1727204559.24688: Evaluated conditional (network_state != {}): False 46400 1727204559.24697: when evaluation is False, skipping this task 46400 1727204559.24704: _execute() done 46400 1727204559.24711: dumping result to json 46400 1727204559.24718: done dumping result, returning 46400 1727204559.24730: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000010fe] 46400 1727204559.24741: sending task result for task 0affcd87-79f5-1303-fda8-0000000010fe 46400 1727204559.24868: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010fe skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204559.24917: no more pending results, returning what we have 46400 1727204559.24922: results queue empty 46400 1727204559.24923: checking for any_errors_fatal 46400 1727204559.24931: done checking for any_errors_fatal 46400 1727204559.24932: checking for max_fail_percentage 46400 1727204559.24934: done checking for max_fail_percentage 46400 1727204559.24935: checking to see if all hosts have failed and the running result is not ok 46400 1727204559.24936: done checking to see if all hosts have failed 46400 1727204559.24937: getting the remaining hosts for this loop 46400 1727204559.24938: done getting the remaining hosts for this loop 46400 1727204559.24943: getting the next task for host managed-node2 46400 1727204559.24953: done getting next task for host managed-node2 46400 1727204559.24957: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204559.24968: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204559.24994: getting variables 46400 1727204559.24996: in VariableManager get_vars() 46400 1727204559.25037: Calling all_inventory to load vars for managed-node2 46400 1727204559.25040: Calling groups_inventory to load vars for managed-node2 46400 1727204559.25043: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204559.25056: Calling all_plugins_play to load vars for managed-node2 46400 1727204559.25063: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204559.25068: Calling groups_plugins_play to load vars for managed-node2 46400 1727204559.26132: WORKER PROCESS EXITING 46400 1727204559.26804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204559.28537: done with get_vars() 46400 1727204559.28565: done getting variables 46400 1727204559.28627: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:39 -0400 (0:00:00.055) 0:00:49.571 ***** 46400 1727204559.28670: entering _queue_task() for managed-node2/package 46400 1727204559.28994: worker is 1 (out of 1 available) 46400 1727204559.29008: exiting _queue_task() for managed-node2/package 46400 1727204559.29022: done queuing things up, now waiting for results queue to drain 46400 1727204559.29024: waiting for pending results... 
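The companion task queued above (roles/network/tasks/main.yml:96) covers the Python binding for nmstate and is gated by the same network_state != {} condition, so it is skipped for the same reason on this run. As before, the package argument below is inferred from the task name rather than taken from the role source.

    # Hypothetical sketch only -- not the actual role source.
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate         # inferred from the task name
        state: present
      when: network_state != {}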
46400 1727204559.29325: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204559.29484: in run() - task 0affcd87-79f5-1303-fda8-0000000010ff 46400 1727204559.29504: variable 'ansible_search_path' from source: unknown 46400 1727204559.29513: variable 'ansible_search_path' from source: unknown 46400 1727204559.29550: calling self._execute() 46400 1727204559.29652: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.29672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.29691: variable 'omit' from source: magic vars 46400 1727204559.30079: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.30097: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204559.30237: variable 'network_state' from source: role '' defaults 46400 1727204559.30252: Evaluated conditional (network_state != {}): False 46400 1727204559.30265: when evaluation is False, skipping this task 46400 1727204559.30273: _execute() done 46400 1727204559.30281: dumping result to json 46400 1727204559.30289: done dumping result, returning 46400 1727204559.30299: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000010ff] 46400 1727204559.30312: sending task result for task 0affcd87-79f5-1303-fda8-0000000010ff 46400 1727204559.30431: done sending task result for task 0affcd87-79f5-1303-fda8-0000000010ff 46400 1727204559.30439: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204559.30495: no more pending results, returning what we have 46400 1727204559.30500: results queue empty 46400 1727204559.30501: checking for any_errors_fatal 46400 1727204559.30508: done checking for any_errors_fatal 46400 1727204559.30509: checking for max_fail_percentage 46400 1727204559.30511: done checking for max_fail_percentage 46400 1727204559.30511: checking to see if all hosts have failed and the running result is not ok 46400 1727204559.30512: done checking to see if all hosts have failed 46400 1727204559.30513: getting the remaining hosts for this loop 46400 1727204559.30515: done getting the remaining hosts for this loop 46400 1727204559.30519: getting the next task for host managed-node2 46400 1727204559.30530: done getting next task for host managed-node2 46400 1727204559.30535: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204559.30542: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204559.30570: getting variables 46400 1727204559.30572: in VariableManager get_vars() 46400 1727204559.30614: Calling all_inventory to load vars for managed-node2 46400 1727204559.30616: Calling groups_inventory to load vars for managed-node2 46400 1727204559.30619: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204559.30633: Calling all_plugins_play to load vars for managed-node2 46400 1727204559.30637: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204559.30641: Calling groups_plugins_play to load vars for managed-node2 46400 1727204559.32585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204559.34413: done with get_vars() 46400 1727204559.34443: done getting variables 46400 1727204559.34511: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:39 -0400 (0:00:00.058) 0:00:49.630 ***** 46400 1727204559.34550: entering _queue_task() for managed-node2/service 46400 1727204559.35009: worker is 1 (out of 1 available) 46400 1727204559.35023: exiting _queue_task() for managed-node2/service 46400 1727204559.35037: done queuing things up, now waiting for results queue to drain 46400 1727204559.35039: waiting for pending results... 
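The restart task announced above only runs when wireless or team connections are defined. A minimal sketch of such a conditional service restart, reconstructed from the task name, the loaded 'service' action plugin, and the false_condition reported below; the service name and state are assumptions implied by the task name rather than taken from the role's source.

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager       # assumed from the task name
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined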
46400 1727204559.35346: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204559.35514: in run() - task 0affcd87-79f5-1303-fda8-000000001100 46400 1727204559.35536: variable 'ansible_search_path' from source: unknown 46400 1727204559.35544: variable 'ansible_search_path' from source: unknown 46400 1727204559.35595: calling self._execute() 46400 1727204559.35702: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.35716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.35731: variable 'omit' from source: magic vars 46400 1727204559.36174: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.36192: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204559.36326: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204559.36538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204559.39509: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204559.39598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204559.39645: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204559.39694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204559.39735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204559.39893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.39978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.40057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.40109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.40129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.40188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.40217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.40247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204559.40305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.40324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.40373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.40402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.40429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.40476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.40498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.40747: variable 'network_connections' from source: include params 46400 1727204559.40768: variable 'interface' from source: play vars 46400 1727204559.40850: variable 'interface' from source: play vars 46400 1727204559.40931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204559.41112: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204559.41177: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204559.41216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204559.41254: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204559.41305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204559.41333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204559.41373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.41403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204559.41585: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204559.41850: variable 'network_connections' from source: include params 46400 1727204559.41870: variable 'interface' 
from source: play vars 46400 1727204559.41938: variable 'interface' from source: play vars 46400 1727204559.42089: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204559.42098: when evaluation is False, skipping this task 46400 1727204559.42105: _execute() done 46400 1727204559.42112: dumping result to json 46400 1727204559.42118: done dumping result, returning 46400 1727204559.42128: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001100] 46400 1727204559.42137: sending task result for task 0affcd87-79f5-1303-fda8-000000001100 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204559.42301: no more pending results, returning what we have 46400 1727204559.42306: results queue empty 46400 1727204559.42307: checking for any_errors_fatal 46400 1727204559.42315: done checking for any_errors_fatal 46400 1727204559.42316: checking for max_fail_percentage 46400 1727204559.42318: done checking for max_fail_percentage 46400 1727204559.42319: checking to see if all hosts have failed and the running result is not ok 46400 1727204559.42320: done checking to see if all hosts have failed 46400 1727204559.42321: getting the remaining hosts for this loop 46400 1727204559.42322: done getting the remaining hosts for this loop 46400 1727204559.42327: getting the next task for host managed-node2 46400 1727204559.42336: done getting next task for host managed-node2 46400 1727204559.42341: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204559.42346: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204559.42373: getting variables 46400 1727204559.42375: in VariableManager get_vars() 46400 1727204559.42417: Calling all_inventory to load vars for managed-node2 46400 1727204559.42420: Calling groups_inventory to load vars for managed-node2 46400 1727204559.42423: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204559.42434: Calling all_plugins_play to load vars for managed-node2 46400 1727204559.42437: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204559.42441: Calling groups_plugins_play to load vars for managed-node2 46400 1727204559.43784: done sending task result for task 0affcd87-79f5-1303-fda8-000000001100 46400 1727204559.43788: WORKER PROCESS EXITING 46400 1727204559.44574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204559.46577: done with get_vars() 46400 1727204559.46610: done getting variables 46400 1727204559.46785: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:39 -0400 (0:00:00.122) 0:00:49.752 ***** 46400 1727204559.46821: entering _queue_task() for managed-node2/service 46400 1727204559.47337: worker is 1 (out of 1 available) 46400 1727204559.47350: exiting _queue_task() for managed-node2/service 46400 1727204559.47369: done queuing things up, now waiting for results queue to drain 46400 1727204559.47371: waiting for pending results... 
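Unlike the two tasks above, this one actually executes: its conditional (network_provider == "nm" or network_state != {}) evaluates True, and the systemd module output further below reports NetworkManager already enabled and started, so the result comes back unchanged. A minimal sketch of the task, assuming it wires network_service_name into the service call the way the module_args in the result suggest:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager here, per the module_args below
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}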
46400 1727204559.48318: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204559.48593: in run() - task 0affcd87-79f5-1303-fda8-000000001101 46400 1727204559.48686: variable 'ansible_search_path' from source: unknown 46400 1727204559.48694: variable 'ansible_search_path' from source: unknown 46400 1727204559.48733: calling self._execute() 46400 1727204559.48886: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.48971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.48986: variable 'omit' from source: magic vars 46400 1727204559.49411: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.49429: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204559.49645: variable 'network_provider' from source: set_fact 46400 1727204559.49671: variable 'network_state' from source: role '' defaults 46400 1727204559.49688: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204559.49700: variable 'omit' from source: magic vars 46400 1727204559.49772: variable 'omit' from source: magic vars 46400 1727204559.49805: variable 'network_service_name' from source: role '' defaults 46400 1727204559.49882: variable 'network_service_name' from source: role '' defaults 46400 1727204559.49999: variable '__network_provider_setup' from source: role '' defaults 46400 1727204559.50010: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204559.50085: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204559.50101: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204559.50168: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204559.50408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204559.54501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204559.54576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204559.54632: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204559.54681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204559.54711: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204559.54798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.54833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.54871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.54917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204559.54937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.54995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.55024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.55054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.55106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.55126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.55384: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204559.55514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.55547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.55582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.55631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.55653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.55755: variable 'ansible_python' from source: facts 46400 1727204559.55783: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204559.55879: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204559.55966: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204559.56094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.56121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.56149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.56197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.56214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.56268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204559.56308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204559.56337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.56387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204559.56409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204559.56557: variable 'network_connections' from source: include params 46400 1727204559.56575: variable 'interface' from source: play vars 46400 1727204559.56655: variable 'interface' from source: play vars 46400 1727204559.56778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204559.56977: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204559.57045: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204559.57097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204559.57153: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204559.57224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204559.57263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204559.57306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204559.57344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204559.57424: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204559.57755: variable 'network_connections' from source: include params 46400 1727204559.57773: variable 'interface' from source: play vars 46400 1727204559.57853: variable 'interface' from source: play vars 46400 1727204559.57911: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204559.58003: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204559.58325: variable 'network_connections' from source: include params 46400 1727204559.58335: variable 'interface' from source: play vars 46400 1727204559.58415: variable 'interface' from source: play vars 46400 1727204559.58444: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204559.58531: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204559.58848: variable 'network_connections' from source: include params 46400 1727204559.58859: variable 'interface' from source: play vars 46400 1727204559.58942: variable 'interface' from source: play vars 46400 1727204559.59649: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204559.59870: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204559.59883: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204559.59950: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204559.60488: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204559.61589: variable 'network_connections' from source: include params 46400 1727204559.61669: variable 'interface' from source: play vars 46400 1727204559.61732: variable 'interface' from source: play vars 46400 1727204559.61781: variable 'ansible_distribution' from source: facts 46400 1727204559.61790: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.61800: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.61911: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204559.62182: variable 'ansible_distribution' from source: facts 46400 1727204559.62321: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.62332: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.62345: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204559.62701: variable 'ansible_distribution' from source: facts 46400 1727204559.62712: variable '__network_rh_distros' from source: role '' defaults 46400 1727204559.62722: variable 'ansible_distribution_major_version' from source: facts 46400 1727204559.62772: variable 'network_provider' from source: set_fact 46400 1727204559.62800: variable 'omit' from source: magic vars 46400 1727204559.62834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204559.62874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204559.62898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204559.62921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204559.62936: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204559.62976: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204559.62986: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.62994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.63098: Set connection var ansible_shell_type to sh 46400 1727204559.63113: Set connection var ansible_shell_executable to /bin/sh 46400 1727204559.63123: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204559.63132: Set connection var ansible_connection to ssh 46400 1727204559.63140: Set connection var ansible_pipelining to False 46400 1727204559.63149: Set connection var ansible_timeout to 10 46400 1727204559.63190: variable 'ansible_shell_executable' from source: unknown 46400 1727204559.63199: variable 'ansible_connection' from source: unknown 46400 1727204559.63206: variable 'ansible_module_compression' from source: unknown 46400 1727204559.63213: variable 'ansible_shell_type' from source: unknown 46400 1727204559.63219: variable 'ansible_shell_executable' from source: unknown 46400 1727204559.63225: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204559.63233: variable 'ansible_pipelining' from source: unknown 46400 1727204559.63241: variable 'ansible_timeout' from source: unknown 46400 1727204559.63249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204559.63368: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204559.63390: variable 'omit' from source: magic vars 46400 1727204559.63402: starting attempt loop 46400 1727204559.63411: running the handler 46400 1727204559.63496: variable 'ansible_facts' from source: unknown 46400 1727204559.64285: _low_level_execute_command(): starting 46400 1727204559.64298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204559.65083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204559.65100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.65116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.65135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.65187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.65199: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204559.65214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.65233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204559.65246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204559.65262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204559.65279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.65294: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204559.65310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.65323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.65334: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204559.65348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.65430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204559.65448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204559.65469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204559.65559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204559.67221: stdout chunk (state=3): >>>/root <<< 46400 1727204559.67326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204559.67385: stderr chunk (state=3): >>><<< 46400 1727204559.67388: stdout chunk (state=3): >>><<< 46400 1727204559.67407: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204559.67417: _low_level_execute_command(): starting 46400 1727204559.67423: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209 `" && echo ansible-tmp-1727204559.6740716-49789-178227490932209="` echo /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209 `" ) && sleep 0' 46400 1727204559.67889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.67895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.67918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.67949: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204559.67952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204559.67954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.67957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.68018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204559.68033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204559.68045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204559.68109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204559.69982: stdout chunk (state=3): >>>ansible-tmp-1727204559.6740716-49789-178227490932209=/root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209 <<< 46400 1727204559.70095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204559.70166: stderr chunk (state=3): >>><<< 46400 1727204559.70170: stdout chunk (state=3): >>><<< 46400 1727204559.70183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204559.6740716-49789-178227490932209=/root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204559.70212: variable 'ansible_module_compression' from source: unknown 46400 1727204559.70253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204559.70301: variable 'ansible_facts' from source: unknown 46400 1727204559.70431: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/AnsiballZ_systemd.py 46400 1727204559.70550: Sending initial data 46400 1727204559.70553: Sent initial data (156 bytes) 46400 1727204559.71544: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204559.71567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.71585: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 46400 1727204559.71610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.71658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.71678: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204559.71694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.71719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204559.71736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204559.71748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204559.71763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.71780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.71797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.71811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.71825: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204559.71841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.71924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204559.71951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204559.71976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204559.72051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204559.73815: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204559.73870: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204559.73918: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpwyonv5nx /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/AnsiballZ_systemd.py <<< 46400 1727204559.73932: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204559.76495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204559.76803: stderr chunk (state=3): >>><<< 46400 1727204559.76806: stdout chunk (state=3): >>><<< 46400 1727204559.76808: done transferring module to remote 46400 1727204559.76810: _low_level_execute_command(): starting 46400 1727204559.76812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/ 
/root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/AnsiballZ_systemd.py && sleep 0' 46400 1727204559.77465: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204559.77484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.77507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.77526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.77573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.77585: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204559.77605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.77627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204559.77639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204559.77649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204559.77665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.77696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204559.77720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.77735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.77746: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204559.77758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.77836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204559.77865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204559.77883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204559.77962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204559.79722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204559.79847: stderr chunk (state=3): >>><<< 46400 1727204559.79928: stdout chunk (state=3): >>><<< 46400 1727204559.80082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204559.80086: _low_level_execute_command(): starting 46400 1727204559.80088: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/AnsiballZ_systemd.py && sleep 0' 46400 1727204559.80913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204559.80919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204559.80940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204559.80943: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204559.82311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204560.07510: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204560.07589: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "7000064", "MemoryAvailable": "infinity", "CPUUsageNSec": "2101623000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204560.07594: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, 
"daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204560.08995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204560.09078: stderr chunk (state=3): >>><<< 46400 1727204560.09082: stdout chunk (state=3): >>><<< 46400 1727204560.09374: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "7000064", "MemoryAvailable": "infinity", "CPUUsageNSec": "2101623000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target 
dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
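Note: the JSON result above is the output of an ansible.legacy.systemd invocation whose module_args appear at the end of the payload ({"name": "NetworkManager", "state": "started", "enabled": true, ...}). As a rough guide to what produced it, a standalone task issuing the same call could look like the sketch below; the role's actual task file is not part of this log, so the layout is illustrative only.

    # Illustrative sketch only -- it mirrors the module_args visible above,
    # not the role's actual task definition.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true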
46400 1727204560.09386: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204560.09389: _low_level_execute_command(): starting 46400 1727204560.09392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204559.6740716-49789-178227490932209/ > /dev/null 2>&1 && sleep 0' 46400 1727204560.11231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204560.11255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.11277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.11385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.11429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.11442: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204560.11458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.11487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204560.11499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204560.11510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204560.11522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.11536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.11553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.11572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.11591: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204560.11606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.11694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204560.11821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.11838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.11913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204560.13794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204560.13798: stdout chunk (state=3): >>><<< 46400 1727204560.13801: stderr chunk (state=3): >>><<< 46400 1727204560.13881: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204560.13885: handler run complete 46400 1727204560.14074: attempt loop complete, returning result 46400 1727204560.14078: _execute() done 46400 1727204560.14080: dumping result to json 46400 1727204560.14082: done dumping result, returning 46400 1727204560.14085: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000001101] 46400 1727204560.14087: sending task result for task 0affcd87-79f5-1303-fda8-000000001101 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204560.14376: no more pending results, returning what we have 46400 1727204560.14381: results queue empty 46400 1727204560.14382: checking for any_errors_fatal 46400 1727204560.14388: done checking for any_errors_fatal 46400 1727204560.14389: checking for max_fail_percentage 46400 1727204560.14391: done checking for max_fail_percentage 46400 1727204560.14392: checking to see if all hosts have failed and the running result is not ok 46400 1727204560.14393: done checking to see if all hosts have failed 46400 1727204560.14394: getting the remaining hosts for this loop 46400 1727204560.14396: done getting the remaining hosts for this loop 46400 1727204560.14400: getting the next task for host managed-node2 46400 1727204560.14409: done getting next task for host managed-node2 46400 1727204560.14414: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204560.14418: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204560.14432: getting variables 46400 1727204560.14434: in VariableManager get_vars() 46400 1727204560.14476: Calling all_inventory to load vars for managed-node2 46400 1727204560.14479: Calling groups_inventory to load vars for managed-node2 46400 1727204560.14482: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204560.14493: Calling all_plugins_play to load vars for managed-node2 46400 1727204560.14496: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204560.14499: Calling groups_plugins_play to load vars for managed-node2 46400 1727204560.15400: done sending task result for task 0affcd87-79f5-1303-fda8-000000001101 46400 1727204560.15407: WORKER PROCESS EXITING 46400 1727204560.18008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204560.23979: done with get_vars() 46400 1727204560.24013: done getting variables 46400 1727204560.24091: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:40 -0400 (0:00:00.773) 0:00:50.525 ***** 46400 1727204560.24127: entering _queue_task() for managed-node2/service 46400 1727204560.24520: worker is 1 (out of 1 available) 46400 1727204560.24535: exiting _queue_task() for managed-node2/service 46400 1727204560.24548: done queuing things up, now waiting for results queue to drain 46400 1727204560.24550: waiting for pending results... 
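Note: the NetworkManager task above reports its result as censored ("the output has been hidden due to the fact that 'no_log: true' was specified for this result"). For reference, a task opts into that behaviour with the no_log keyword; only the flag itself is taken from the log, the task body below is illustrative.

    # Illustrative: the no_log flag that produces the censored result above.
    - name: Manage a service without recording the full module result
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
      no_log: true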
46400 1727204560.25705: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204560.25884: in run() - task 0affcd87-79f5-1303-fda8-000000001102 46400 1727204560.25906: variable 'ansible_search_path' from source: unknown 46400 1727204560.25914: variable 'ansible_search_path' from source: unknown 46400 1727204560.25969: calling self._execute() 46400 1727204560.26085: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.26097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.26112: variable 'omit' from source: magic vars 46400 1727204560.26525: variable 'ansible_distribution_major_version' from source: facts 46400 1727204560.26541: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204560.26675: variable 'network_provider' from source: set_fact 46400 1727204560.26687: Evaluated conditional (network_provider == "nm"): True 46400 1727204560.26795: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204560.26904: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204560.27098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204560.30480: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204560.30556: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204560.30610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204560.30649: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204560.30691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204560.30795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204560.30828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204560.30863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204560.30916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204560.30935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204560.30989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204560.31023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204560.31054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204560.31104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204560.31130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204560.31179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204560.31208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204560.31274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204560.31338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204560.31375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204560.31777: variable 'network_connections' from source: include params 46400 1727204560.31806: variable 'interface' from source: play vars 46400 1727204560.31967: variable 'interface' from source: play vars 46400 1727204560.32124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204560.32387: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204560.32429: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204560.32478: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204560.32511: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204560.32572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204560.32600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204560.32686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204560.32742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204560.32956: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204560.33844: variable 'network_connections' from source: include params 46400 1727204560.33869: variable 'interface' from source: play vars 46400 1727204560.33946: variable 'interface' from source: play vars 46400 1727204560.34014: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204560.34041: when evaluation is False, skipping this task 46400 1727204560.34059: _execute() done 46400 1727204560.34090: dumping result to json 46400 1727204560.34108: done dumping result, returning 46400 1727204560.34149: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000001102] 46400 1727204560.34174: sending task result for task 0affcd87-79f5-1303-fda8-000000001102 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204560.34486: no more pending results, returning what we have 46400 1727204560.34491: results queue empty 46400 1727204560.34492: checking for any_errors_fatal 46400 1727204560.34540: done checking for any_errors_fatal 46400 1727204560.34595: checking for max_fail_percentage 46400 1727204560.34598: done checking for max_fail_percentage 46400 1727204560.34599: checking to see if all hosts have failed and the running result is not ok 46400 1727204560.34600: done checking to see if all hosts have failed 46400 1727204560.34601: getting the remaining hosts for this loop 46400 1727204560.34602: done getting the remaining hosts for this loop 46400 1727204560.34623: getting the next task for host managed-node2 46400 1727204560.34640: done getting next task for host managed-node2 46400 1727204560.34645: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204560.34661: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204560.34683: getting variables 46400 1727204560.34686: in VariableManager get_vars() 46400 1727204560.34773: Calling all_inventory to load vars for managed-node2 46400 1727204560.34776: Calling groups_inventory to load vars for managed-node2 46400 1727204560.34779: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204560.34804: Calling all_plugins_play to load vars for managed-node2 46400 1727204560.34809: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204560.34813: Calling groups_plugins_play to load vars for managed-node2 46400 1727204560.35927: done sending task result for task 0affcd87-79f5-1303-fda8-000000001102 46400 1727204560.35932: WORKER PROCESS EXITING 46400 1727204560.36920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204560.39218: done with get_vars() 46400 1727204560.39258: done getting variables 46400 1727204560.39449: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:40 -0400 (0:00:00.153) 0:00:50.679 ***** 46400 1727204560.39509: entering _queue_task() for managed-node2/service 46400 1727204560.40030: worker is 1 (out of 1 available) 46400 1727204560.40048: exiting _queue_task() for managed-node2/service 46400 1727204560.40074: done queuing things up, now waiting for results queue to drain 46400 1727204560.40076: waiting for pending results... 
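Note: the wpa_supplicant task above is skipped with "false_condition": "__network_wpa_supplicant_required" -- no wireless or IEEE 802.1X connections are defined in this run, so the supplicant is not needed. A conditional task of that shape is sketched below; the role's real task is not shown in this log, so everything except the when: expression is illustrative.

    # Illustrative: a task gated on the same condition the skip above reports.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required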
46400 1727204560.40481: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204560.40729: in run() - task 0affcd87-79f5-1303-fda8-000000001103 46400 1727204560.40795: variable 'ansible_search_path' from source: unknown 46400 1727204560.40817: variable 'ansible_search_path' from source: unknown 46400 1727204560.40870: calling self._execute() 46400 1727204560.40988: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.41012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.41031: variable 'omit' from source: magic vars 46400 1727204560.41441: variable 'ansible_distribution_major_version' from source: facts 46400 1727204560.41466: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204560.41604: variable 'network_provider' from source: set_fact 46400 1727204560.41621: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204560.41629: when evaluation is False, skipping this task 46400 1727204560.41635: _execute() done 46400 1727204560.41643: dumping result to json 46400 1727204560.41651: done dumping result, returning 46400 1727204560.41665: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000001103] 46400 1727204560.41679: sending task result for task 0affcd87-79f5-1303-fda8-000000001103 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204560.41837: no more pending results, returning what we have 46400 1727204560.41842: results queue empty 46400 1727204560.41843: checking for any_errors_fatal 46400 1727204560.41852: done checking for any_errors_fatal 46400 1727204560.41853: checking for max_fail_percentage 46400 1727204560.41855: done checking for max_fail_percentage 46400 1727204560.41856: checking to see if all hosts have failed and the running result is not ok 46400 1727204560.41857: done checking to see if all hosts have failed 46400 1727204560.41858: getting the remaining hosts for this loop 46400 1727204560.41862: done getting the remaining hosts for this loop 46400 1727204560.41869: getting the next task for host managed-node2 46400 1727204560.41879: done getting next task for host managed-node2 46400 1727204560.41884: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204560.41890: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204560.41951: getting variables 46400 1727204560.41953: in VariableManager get_vars() 46400 1727204560.42029: Calling all_inventory to load vars for managed-node2 46400 1727204560.42032: Calling groups_inventory to load vars for managed-node2 46400 1727204560.42035: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204560.42050: Calling all_plugins_play to load vars for managed-node2 46400 1727204560.42053: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204560.42056: Calling groups_plugins_play to load vars for managed-node2 46400 1727204560.43172: done sending task result for task 0affcd87-79f5-1303-fda8-000000001103 46400 1727204560.43176: WORKER PROCESS EXITING 46400 1727204560.44794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204560.48744: done with get_vars() 46400 1727204560.48934: done getting variables 46400 1727204560.49119: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:40 -0400 (0:00:00.096) 0:00:50.776 ***** 46400 1727204560.49164: entering _queue_task() for managed-node2/copy 46400 1727204560.49813: worker is 1 (out of 1 available) 46400 1727204560.49831: exiting _queue_task() for managed-node2/copy 46400 1727204560.49845: done queuing things up, now waiting for results queue to drain 46400 1727204560.49847: waiting for pending results... 
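Note: the "Enable network service" task above is skipped because "Evaluated conditional (network_provider == "initscripts"): False" -- this run resolved the provider to NetworkManager (network_provider came from set_fact earlier in the log). A play can also pin the provider explicitly; the sketch below is a generic illustration, not the play used in this run.

    # Illustrative: pinning the provider so initscripts-only tasks are skipped.
    - hosts: managed-node2
      vars:
        network_provider: nm
      roles:
        - fedora.linux_system_roles.network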
46400 1727204560.50277: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204560.50526: in run() - task 0affcd87-79f5-1303-fda8-000000001104 46400 1727204560.50547: variable 'ansible_search_path' from source: unknown 46400 1727204560.50556: variable 'ansible_search_path' from source: unknown 46400 1727204560.50648: calling self._execute() 46400 1727204560.51026: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.51039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.51054: variable 'omit' from source: magic vars 46400 1727204560.51752: variable 'ansible_distribution_major_version' from source: facts 46400 1727204560.51774: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204560.51967: variable 'network_provider' from source: set_fact 46400 1727204560.51980: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204560.51988: when evaluation is False, skipping this task 46400 1727204560.51995: _execute() done 46400 1727204560.52002: dumping result to json 46400 1727204560.52009: done dumping result, returning 46400 1727204560.52021: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000001104] 46400 1727204560.52113: sending task result for task 0affcd87-79f5-1303-fda8-000000001104 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204560.52281: no more pending results, returning what we have 46400 1727204560.52285: results queue empty 46400 1727204560.52287: checking for any_errors_fatal 46400 1727204560.52297: done checking for any_errors_fatal 46400 1727204560.52298: checking for max_fail_percentage 46400 1727204560.52300: done checking for max_fail_percentage 46400 1727204560.52301: checking to see if all hosts have failed and the running result is not ok 46400 1727204560.52302: done checking to see if all hosts have failed 46400 1727204560.52302: getting the remaining hosts for this loop 46400 1727204560.52304: done getting the remaining hosts for this loop 46400 1727204560.52309: getting the next task for host managed-node2 46400 1727204560.52317: done getting next task for host managed-node2 46400 1727204560.52322: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204560.52328: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204560.52356: getting variables 46400 1727204560.52358: in VariableManager get_vars() 46400 1727204560.52405: Calling all_inventory to load vars for managed-node2 46400 1727204560.52408: Calling groups_inventory to load vars for managed-node2 46400 1727204560.52411: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204560.52424: Calling all_plugins_play to load vars for managed-node2 46400 1727204560.52428: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204560.52431: Calling groups_plugins_play to load vars for managed-node2 46400 1727204560.53458: done sending task result for task 0affcd87-79f5-1303-fda8-000000001104 46400 1727204560.53462: WORKER PROCESS EXITING 46400 1727204560.55861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204560.58179: done with get_vars() 46400 1727204560.58210: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:40 -0400 (0:00:00.092) 0:00:50.868 ***** 46400 1727204560.58442: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204560.58845: worker is 1 (out of 1 available) 46400 1727204560.58859: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204560.58875: done queuing things up, now waiting for results queue to drain 46400 1727204560.58876: waiting for pending results... 
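Note: the "Configure networking connection profiles" task queued above consumes the network_connections variable (loaded "from source: include params", with interface coming from play vars). The actual profile in this run is hidden by no_log, so the sketch below only illustrates the general shape of that variable; the ethernet type, state and dhcp4 settings are assumptions, and interface is the play-level variable referenced in the log.

    # Generic illustration of the network_connections shape the role consumes;
    # the real profile used in this run is not visible in the log.
    network_connections:
      - name: "{{ interface }}"
        interface_name: "{{ interface }}"
        type: ethernet          # assumption -- not shown in the log
        state: up               # assumption -- not shown in the log
        ip:
          dhcp4: true           # assumption -- not shown in the log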
46400 1727204560.59197: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204560.59349: in run() - task 0affcd87-79f5-1303-fda8-000000001105 46400 1727204560.59374: variable 'ansible_search_path' from source: unknown 46400 1727204560.59381: variable 'ansible_search_path' from source: unknown 46400 1727204560.59420: calling self._execute() 46400 1727204560.59536: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.59558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.59575: variable 'omit' from source: magic vars 46400 1727204560.59973: variable 'ansible_distribution_major_version' from source: facts 46400 1727204560.59999: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204560.60011: variable 'omit' from source: magic vars 46400 1727204560.60079: variable 'omit' from source: magic vars 46400 1727204560.60256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204560.62629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204560.62703: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204560.62754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204560.62793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204560.62835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204560.62942: variable 'network_provider' from source: set_fact 46400 1727204560.63089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204560.63121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204560.63163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204560.63209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204560.63227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204560.63319: variable 'omit' from source: magic vars 46400 1727204560.63443: variable 'omit' from source: magic vars 46400 1727204560.63560: variable 'network_connections' from source: include params 46400 1727204560.63584: variable 'interface' from source: play vars 46400 1727204560.63656: variable 'interface' from source: play vars 46400 1727204560.63836: variable 'omit' from source: magic vars 46400 1727204560.63849: variable '__lsr_ansible_managed' from source: task vars 46400 1727204560.63923: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204560.64369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204560.64647: Loaded config def from plugin (lookup/template) 46400 1727204560.64656: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204560.64695: File lookup term: get_ansible_managed.j2 46400 1727204560.64702: variable 'ansible_search_path' from source: unknown 46400 1727204560.64718: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204560.64737: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204560.64767: variable 'ansible_search_path' from source: unknown 46400 1727204560.71741: variable 'ansible_managed' from source: unknown 46400 1727204560.71903: variable 'omit' from source: magic vars 46400 1727204560.71942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204560.71978: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204560.72005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204560.72029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204560.72049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204560.72088: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204560.72097: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.72106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.72212: Set connection var ansible_shell_type to sh 46400 1727204560.72231: Set connection var ansible_shell_executable to /bin/sh 46400 1727204560.72240: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204560.72249: Set connection var ansible_connection to ssh 46400 1727204560.72257: Set connection var ansible_pipelining to False 46400 1727204560.72272: Set connection var ansible_timeout to 10 46400 1727204560.72308: variable 'ansible_shell_executable' from source: unknown 46400 1727204560.72316: variable 'ansible_connection' from source: unknown 46400 1727204560.72322: variable 'ansible_module_compression' 
from source: unknown 46400 1727204560.72328: variable 'ansible_shell_type' from source: unknown 46400 1727204560.72340: variable 'ansible_shell_executable' from source: unknown 46400 1727204560.72348: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204560.72355: variable 'ansible_pipelining' from source: unknown 46400 1727204560.72362: variable 'ansible_timeout' from source: unknown 46400 1727204560.72375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204560.72600: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204560.72671: variable 'omit' from source: magic vars 46400 1727204560.72683: starting attempt loop 46400 1727204560.72693: running the handler 46400 1727204560.72713: _low_level_execute_command(): starting 46400 1727204560.72724: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204560.74571: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.74574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.74812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.74815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.74817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.74894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204560.74897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.74986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.75245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204560.76898: stdout chunk (state=3): >>>/root <<< 46400 1727204560.77004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204560.77094: stderr chunk (state=3): >>><<< 46400 1727204560.77097: stdout chunk (state=3): >>><<< 46400 1727204560.77217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204560.77221: _low_level_execute_command(): starting 46400 1727204560.77224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529 `" && echo ansible-tmp-1727204560.7711897-49881-192177425979529="` echo /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529 `" ) && sleep 0' 46400 1727204560.78556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.78560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.78804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204560.78808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.78810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.78882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.78989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.79233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204560.81089: stdout chunk (state=3): >>>ansible-tmp-1727204560.7711897-49881-192177425979529=/root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529 <<< 46400 1727204560.81203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204560.81281: stderr chunk (state=3): >>><<< 46400 1727204560.81285: stdout chunk (state=3): >>><<< 46400 1727204560.81575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204560.7711897-49881-192177425979529=/root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204560.81578: variable 'ansible_module_compression' from source: unknown 46400 1727204560.81581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204560.81583: variable 'ansible_facts' from source: unknown 46400 1727204560.81586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/AnsiballZ_network_connections.py 46400 1727204560.82128: Sending initial data 46400 1727204560.82132: Sent initial data (168 bytes) 46400 1727204560.84607: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204560.84907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.84924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.84942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.84994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.85010: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204560.85028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.85045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204560.85057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204560.85070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204560.85082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.85094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.85114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.85126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.85141: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204560.85154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.85299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204560.85582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.85604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.85672: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 46400 1727204560.87360: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204560.87410: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204560.87431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpcnj32n0i /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/AnsiballZ_network_connections.py <<< 46400 1727204560.87498: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204560.89687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204560.89762: stderr chunk (state=3): >>><<< 46400 1727204560.89768: stdout chunk (state=3): >>><<< 46400 1727204560.89792: done transferring module to remote 46400 1727204560.89803: _low_level_execute_command(): starting 46400 1727204560.89808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/ /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/AnsiballZ_network_connections.py && sleep 0' 46400 1727204560.91406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204560.91586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.91596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.91610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.91649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.91657: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204560.91671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.91686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204560.91694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204560.91701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204560.91708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.91718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.91729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.91737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.91743: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204560.91752: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.91828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204560.91994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.92004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.92236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204560.93982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204560.94002: stderr chunk (state=3): >>><<< 46400 1727204560.94006: stdout chunk (state=3): >>><<< 46400 1727204560.94025: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204560.94029: _low_level_execute_command(): starting 46400 1727204560.94031: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/AnsiballZ_network_connections.py && sleep 0' 46400 1727204560.95920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204560.95929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.95939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.95953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.95997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.96180: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204560.96190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.96204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204560.96212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204560.96218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204560.96226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204560.96237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204560.96246: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204560.96254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204560.96260: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204560.96275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204560.96363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204560.96369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204560.96376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204560.96542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204561.21984: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204561.24021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204561.24026: stdout chunk (state=3): >>><<< 46400 1727204561.24031: stderr chunk (state=3): >>><<< 46400 1727204561.24056: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
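The module result above comes from fedora.linux_system_roles.network_connections being handed a single bridge profile named "statebr" with DHCP4 and IPv6 autoconf disabled. A minimal sketch of role input that would produce the logged module_args; the connection values are copied from the invocation above, while the surrounding play and hosts line are assumptions not shown in this log:

- hosts: managed-node2
  tasks:
    - name: Create the statebr bridge profile
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false

With provider "nm", as logged, each entry becomes a NetworkManager connection profile; "persistent_state: present" only creates or updates the profile, it does not force the connection up, which matches the stderr line "add connection statebr" with no activation step.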
46400 1727204561.24102: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204561.24111: _low_level_execute_command(): starting 46400 1727204561.24116: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204560.7711897-49881-192177425979529/ > /dev/null 2>&1 && sleep 0' 46400 1727204561.24777: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204561.24781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.24793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.24804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.24842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.24849: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204561.24858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.24878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204561.24885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204561.24893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204561.24903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.24906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.24917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.24924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.24930: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204561.24941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.25029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204561.25033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.25040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.25680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204561.27061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 46400 1727204561.27075: stdout chunk (state=3): >>><<< 46400 1727204561.27078: stderr chunk (state=3): >>><<< 46400 1727204561.27093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204561.27102: handler run complete 46400 1727204561.27136: attempt loop complete, returning result 46400 1727204561.27140: _execute() done 46400 1727204561.27142: dumping result to json 46400 1727204561.27147: done dumping result, returning 46400 1727204561.27156: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000001105] 46400 1727204561.27166: sending task result for task 0affcd87-79f5-1303-fda8-000000001105 46400 1727204561.27298: done sending task result for task 0affcd87-79f5-1303-fda8-000000001105 46400 1727204561.27301: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 46400 1727204561.27403: no more pending results, returning what we have 46400 1727204561.27406: results queue empty 46400 1727204561.27407: checking for any_errors_fatal 46400 1727204561.27413: done checking for any_errors_fatal 46400 1727204561.27414: checking for max_fail_percentage 46400 1727204561.27415: done checking for max_fail_percentage 46400 1727204561.27416: checking to see if all hosts have failed and the running result is not ok 46400 1727204561.27417: done checking to see if all hosts have failed 46400 1727204561.27418: getting the remaining hosts for this loop 46400 1727204561.27420: done getting the remaining hosts for this loop 46400 1727204561.27424: getting the next task for host managed-node2 46400 1727204561.27430: done getting next task for host managed-node2 46400 1727204561.27434: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204561.27439: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204561.27451: getting variables 46400 1727204561.27453: in VariableManager get_vars() 46400 1727204561.27491: Calling all_inventory to load vars for managed-node2 46400 1727204561.27493: Calling groups_inventory to load vars for managed-node2 46400 1727204561.27495: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204561.27505: Calling all_plugins_play to load vars for managed-node2 46400 1727204561.27507: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204561.27510: Calling groups_plugins_play to load vars for managed-node2 46400 1727204561.30643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204561.33433: done with get_vars() 46400 1727204561.33475: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:41 -0400 (0:00:00.751) 0:00:51.620 ***** 46400 1727204561.33574: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204561.33942: worker is 1 (out of 1 available) 46400 1727204561.33955: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204561.33971: done queuing things up, now waiting for results queue to drain 46400 1727204561.33973: waiting for pending results... 
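The "Configure networking state" task queued above (main.yml:171) is gated on network_state, which the role defaults to {}, so it is skipped in this run; the evaluation appears just below. A hedged sketch of what a caller would set to take this path instead; the Nmstate-style keys are an assumption based on that schema, not something shown in this log:

vars:
  network_state:
    interfaces:
      - name: statebr
        type: linux-bridge
        state: up

When network_state is non-empty, the role hands it to the fedora.linux_system_roles.network_state module (the name the log queues the task under) in addition to the profile-based network_connections path seen earlier.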
46400 1727204561.34424: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204561.34593: in run() - task 0affcd87-79f5-1303-fda8-000000001106 46400 1727204561.34613: variable 'ansible_search_path' from source: unknown 46400 1727204561.34621: variable 'ansible_search_path' from source: unknown 46400 1727204561.34692: calling self._execute() 46400 1727204561.34830: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.34842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.34855: variable 'omit' from source: magic vars 46400 1727204561.35279: variable 'ansible_distribution_major_version' from source: facts 46400 1727204561.35295: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204561.35444: variable 'network_state' from source: role '' defaults 46400 1727204561.35459: Evaluated conditional (network_state != {}): False 46400 1727204561.35468: when evaluation is False, skipping this task 46400 1727204561.35474: _execute() done 46400 1727204561.35482: dumping result to json 46400 1727204561.35488: done dumping result, returning 46400 1727204561.35498: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000001106] 46400 1727204561.35508: sending task result for task 0affcd87-79f5-1303-fda8-000000001106 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204561.35680: no more pending results, returning what we have 46400 1727204561.35685: results queue empty 46400 1727204561.35686: checking for any_errors_fatal 46400 1727204561.35700: done checking for any_errors_fatal 46400 1727204561.35701: checking for max_fail_percentage 46400 1727204561.35703: done checking for max_fail_percentage 46400 1727204561.35704: checking to see if all hosts have failed and the running result is not ok 46400 1727204561.35705: done checking to see if all hosts have failed 46400 1727204561.35706: getting the remaining hosts for this loop 46400 1727204561.35708: done getting the remaining hosts for this loop 46400 1727204561.35712: getting the next task for host managed-node2 46400 1727204561.35721: done getting next task for host managed-node2 46400 1727204561.35726: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204561.35732: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204561.35756: getting variables 46400 1727204561.35758: in VariableManager get_vars() 46400 1727204561.35800: Calling all_inventory to load vars for managed-node2 46400 1727204561.35802: Calling groups_inventory to load vars for managed-node2 46400 1727204561.35805: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204561.35818: Calling all_plugins_play to load vars for managed-node2 46400 1727204561.35821: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204561.35824: Calling groups_plugins_play to load vars for managed-node2 46400 1727204561.36814: done sending task result for task 0affcd87-79f5-1303-fda8-000000001106 46400 1727204561.36818: WORKER PROCESS EXITING 46400 1727204561.37886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204561.39907: done with get_vars() 46400 1727204561.39933: done getting variables 46400 1727204561.40002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:41 -0400 (0:00:00.064) 0:00:51.684 ***** 46400 1727204561.40043: entering _queue_task() for managed-node2/debug 46400 1727204561.40406: worker is 1 (out of 1 available) 46400 1727204561.40419: exiting _queue_task() for managed-node2/debug 46400 1727204561.40436: done queuing things up, now waiting for results queue to drain 46400 1727204561.40439: waiting for pending results... 
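The task queued here (main.yml:177) is a plain debug of the stderr captured from the network_connections run. A sketch consistent with the "ok:" output printed further below; the exact task body is an assumption reconstructed from that output:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

Printing stderr_lines rather than stderr yields one list element per line, which is why the result below shows a single-element list containing the "[002] ... add connection statebr" message.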
46400 1727204561.40756: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204561.40924: in run() - task 0affcd87-79f5-1303-fda8-000000001107 46400 1727204561.40950: variable 'ansible_search_path' from source: unknown 46400 1727204561.40959: variable 'ansible_search_path' from source: unknown 46400 1727204561.41010: calling self._execute() 46400 1727204561.41119: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.41130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.41162: variable 'omit' from source: magic vars 46400 1727204561.42067: variable 'ansible_distribution_major_version' from source: facts 46400 1727204561.42125: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204561.42175: variable 'omit' from source: magic vars 46400 1727204561.42283: variable 'omit' from source: magic vars 46400 1727204561.42322: variable 'omit' from source: magic vars 46400 1727204561.42380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204561.42418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204561.42452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204561.42482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.42570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.42574: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204561.42577: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.42581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.42930: Set connection var ansible_shell_type to sh 46400 1727204561.42939: Set connection var ansible_shell_executable to /bin/sh 46400 1727204561.42944: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204561.42949: Set connection var ansible_connection to ssh 46400 1727204561.42955: Set connection var ansible_pipelining to False 46400 1727204561.42960: Set connection var ansible_timeout to 10 46400 1727204561.43031: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.43052: variable 'ansible_connection' from source: unknown 46400 1727204561.43059: variable 'ansible_module_compression' from source: unknown 46400 1727204561.43074: variable 'ansible_shell_type' from source: unknown 46400 1727204561.43080: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.43086: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.43115: variable 'ansible_pipelining' from source: unknown 46400 1727204561.43204: variable 'ansible_timeout' from source: unknown 46400 1727204561.43219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.43860: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204561.43876: variable 'omit' from source: magic vars 46400 1727204561.43881: starting attempt loop 46400 1727204561.43884: running the handler 46400 1727204561.44227: variable '__network_connections_result' from source: set_fact 46400 1727204561.44421: handler run complete 46400 1727204561.44475: attempt loop complete, returning result 46400 1727204561.44483: _execute() done 46400 1727204561.44516: dumping result to json 46400 1727204561.44544: done dumping result, returning 46400 1727204561.44553: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000001107] 46400 1727204561.44566: sending task result for task 0affcd87-79f5-1303-fda8-000000001107 46400 1727204561.44772: done sending task result for task 0affcd87-79f5-1303-fda8-000000001107 46400 1727204561.44778: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024" ] } 46400 1727204561.44922: no more pending results, returning what we have 46400 1727204561.44926: results queue empty 46400 1727204561.44928: checking for any_errors_fatal 46400 1727204561.44936: done checking for any_errors_fatal 46400 1727204561.44937: checking for max_fail_percentage 46400 1727204561.44938: done checking for max_fail_percentage 46400 1727204561.44940: checking to see if all hosts have failed and the running result is not ok 46400 1727204561.44941: done checking to see if all hosts have failed 46400 1727204561.44941: getting the remaining hosts for this loop 46400 1727204561.44943: done getting the remaining hosts for this loop 46400 1727204561.44949: getting the next task for host managed-node2 46400 1727204561.44959: done getting next task for host managed-node2 46400 1727204561.44965: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204561.44971: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204561.44986: getting variables 46400 1727204561.44988: in VariableManager get_vars() 46400 1727204561.45033: Calling all_inventory to load vars for managed-node2 46400 1727204561.45036: Calling groups_inventory to load vars for managed-node2 46400 1727204561.45039: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204561.45077: Calling all_plugins_play to load vars for managed-node2 46400 1727204561.45102: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204561.45107: Calling groups_plugins_play to load vars for managed-node2 46400 1727204561.47152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204561.50210: done with get_vars() 46400 1727204561.50276: done getting variables 46400 1727204561.50427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:41 -0400 (0:00:00.104) 0:00:51.789 ***** 46400 1727204561.50492: entering _queue_task() for managed-node2/debug 46400 1727204561.50966: worker is 1 (out of 1 available) 46400 1727204561.50980: exiting _queue_task() for managed-node2/debug 46400 1727204561.50994: done queuing things up, now waiting for results queue to drain 46400 1727204561.50996: waiting for pending results... 46400 1727204561.51184: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204561.51292: in run() - task 0affcd87-79f5-1303-fda8-000000001108 46400 1727204561.51305: variable 'ansible_search_path' from source: unknown 46400 1727204561.51308: variable 'ansible_search_path' from source: unknown 46400 1727204561.51338: calling self._execute() 46400 1727204561.51414: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.51417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.51426: variable 'omit' from source: magic vars 46400 1727204561.51728: variable 'ansible_distribution_major_version' from source: facts 46400 1727204561.51740: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204561.51745: variable 'omit' from source: magic vars 46400 1727204561.51797: variable 'omit' from source: magic vars 46400 1727204561.51820: variable 'omit' from source: magic vars 46400 1727204561.51857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204561.51889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204561.51907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204561.51920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.51930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.51955: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204561.51958: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.51961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.52031: Set connection var ansible_shell_type to sh 46400 1727204561.52040: Set connection var ansible_shell_executable to /bin/sh 46400 1727204561.52045: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204561.52050: Set connection var ansible_connection to ssh 46400 1727204561.52055: Set connection var ansible_pipelining to False 46400 1727204561.52062: Set connection var ansible_timeout to 10 46400 1727204561.52084: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.52087: variable 'ansible_connection' from source: unknown 46400 1727204561.52090: variable 'ansible_module_compression' from source: unknown 46400 1727204561.52092: variable 'ansible_shell_type' from source: unknown 46400 1727204561.52095: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.52097: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.52099: variable 'ansible_pipelining' from source: unknown 46400 1727204561.52101: variable 'ansible_timeout' from source: unknown 46400 1727204561.52106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.52210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204561.52219: variable 'omit' from source: magic vars 46400 1727204561.52226: starting attempt loop 46400 1727204561.52229: running the handler 46400 1727204561.52281: variable '__network_connections_result' from source: set_fact 46400 1727204561.52336: variable '__network_connections_result' from source: set_fact 46400 1727204561.52477: handler run complete 46400 1727204561.52496: attempt loop complete, returning result 46400 1727204561.52500: _execute() done 46400 1727204561.52503: dumping result to json 46400 1727204561.52505: done dumping result, returning 46400 1727204561.52538: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000001108] 46400 1727204561.52578: sending task result for task 0affcd87-79f5-1303-fda8-000000001108 46400 1727204561.52727: done sending task result for task 0affcd87-79f5-1303-fda8-000000001108 46400 1727204561.52730: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024" ] } } 46400 1727204561.52826: no more pending results, returning what we have 46400 
1727204561.52829: results queue empty 46400 1727204561.52830: checking for any_errors_fatal 46400 1727204561.52836: done checking for any_errors_fatal 46400 1727204561.52837: checking for max_fail_percentage 46400 1727204561.52839: done checking for max_fail_percentage 46400 1727204561.52840: checking to see if all hosts have failed and the running result is not ok 46400 1727204561.52841: done checking to see if all hosts have failed 46400 1727204561.52842: getting the remaining hosts for this loop 46400 1727204561.52843: done getting the remaining hosts for this loop 46400 1727204561.52847: getting the next task for host managed-node2 46400 1727204561.52857: done getting next task for host managed-node2 46400 1727204561.52861: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204561.52885: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204561.52898: getting variables 46400 1727204561.52899: in VariableManager get_vars() 46400 1727204561.52930: Calling all_inventory to load vars for managed-node2 46400 1727204561.52932: Calling groups_inventory to load vars for managed-node2 46400 1727204561.52934: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204561.52940: Calling all_plugins_play to load vars for managed-node2 46400 1727204561.52942: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204561.52943: Calling groups_plugins_play to load vars for managed-node2 46400 1727204561.54342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204561.55741: done with get_vars() 46400 1727204561.55773: done getting variables 46400 1727204561.55823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:41 -0400 (0:00:00.053) 0:00:51.843 ***** 46400 1727204561.55850: entering _queue_task() for managed-node2/debug 46400 1727204561.56134: worker is 1 (out of 1 available) 46400 1727204561.56147: exiting _queue_task() for managed-node2/debug 46400 1727204561.56166: done queuing things up, now waiting for results queue to drain 46400 1727204561.56168: waiting for pending results... 46400 1727204561.56379: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204561.56548: in run() - task 0affcd87-79f5-1303-fda8-000000001109 46400 1727204561.56562: variable 'ansible_search_path' from source: unknown 46400 1727204561.56566: variable 'ansible_search_path' from source: unknown 46400 1727204561.56596: calling self._execute() 46400 1727204561.56692: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.56704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.56723: variable 'omit' from source: magic vars 46400 1727204561.57101: variable 'ansible_distribution_major_version' from source: facts 46400 1727204561.57110: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204561.57284: variable 'network_state' from source: role '' defaults 46400 1727204561.57294: Evaluated conditional (network_state != {}): False 46400 1727204561.57297: when evaluation is False, skipping this task 46400 1727204561.57300: _execute() done 46400 1727204561.57302: dumping result to json 46400 1727204561.57304: done dumping result, returning 46400 1727204561.57312: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000001109] 46400 1727204561.57316: sending task result for task 0affcd87-79f5-1303-fda8-000000001109 46400 1727204561.57404: done sending task result for task 0affcd87-79f5-1303-fda8-000000001109 46400 1727204561.57408: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204561.57465: no more pending results, returning what we 
have 46400 1727204561.57469: results queue empty 46400 1727204561.57470: checking for any_errors_fatal 46400 1727204561.57482: done checking for any_errors_fatal 46400 1727204561.57483: checking for max_fail_percentage 46400 1727204561.57484: done checking for max_fail_percentage 46400 1727204561.57485: checking to see if all hosts have failed and the running result is not ok 46400 1727204561.57486: done checking to see if all hosts have failed 46400 1727204561.57487: getting the remaining hosts for this loop 46400 1727204561.57488: done getting the remaining hosts for this loop 46400 1727204561.57492: getting the next task for host managed-node2 46400 1727204561.57499: done getting next task for host managed-node2 46400 1727204561.57504: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204561.57508: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204561.57583: getting variables 46400 1727204561.57585: in VariableManager get_vars() 46400 1727204561.57643: Calling all_inventory to load vars for managed-node2 46400 1727204561.57645: Calling groups_inventory to load vars for managed-node2 46400 1727204561.57647: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204561.57675: Calling all_plugins_play to load vars for managed-node2 46400 1727204561.57685: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204561.57690: Calling groups_plugins_play to load vars for managed-node2 46400 1727204561.60186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204561.63200: done with get_vars() 46400 1727204561.63236: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:41 -0400 (0:00:00.075) 0:00:51.918 ***** 46400 1727204561.63361: entering _queue_task() for managed-node2/ping 46400 1727204561.63818: worker is 1 (out of 1 available) 46400 1727204561.63836: exiting _queue_task() for managed-node2/ping 46400 1727204561.63859: done queuing things up, now waiting for results queue to drain 46400 1727204561.63861: waiting for pending results... 
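The last role task queued in this block (main.yml:192) re-checks that the managed host is still reachable after the profiles were reconfigured; the log shows it dispatched through the ping action. A sketch of the task shape, assuming nothing beyond the module the log names:

- name: Re-test connectivity
  ansible.builtin.ping:

ping is a minimal end-to-end check: it needs a working connection plugin and a usable Python on the target and simply returns "pong", so a network change that broke SSH reachability would surface here as an unreachable host rather than a silent success.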
46400 1727204561.64235: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204561.64471: in run() - task 0affcd87-79f5-1303-fda8-00000000110a 46400 1727204561.64497: variable 'ansible_search_path' from source: unknown 46400 1727204561.64504: variable 'ansible_search_path' from source: unknown 46400 1727204561.64546: calling self._execute() 46400 1727204561.64646: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.64657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.64681: variable 'omit' from source: magic vars 46400 1727204561.65106: variable 'ansible_distribution_major_version' from source: facts 46400 1727204561.65124: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204561.65136: variable 'omit' from source: magic vars 46400 1727204561.65222: variable 'omit' from source: magic vars 46400 1727204561.65282: variable 'omit' from source: magic vars 46400 1727204561.65331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204561.65396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204561.65425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204561.65469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.65486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204561.65527: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204561.65537: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.65545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.65654: Set connection var ansible_shell_type to sh 46400 1727204561.65670: Set connection var ansible_shell_executable to /bin/sh 46400 1727204561.65679: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204561.65705: Set connection var ansible_connection to ssh 46400 1727204561.65742: Set connection var ansible_pipelining to False 46400 1727204561.65756: Set connection var ansible_timeout to 10 46400 1727204561.65795: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.65822: variable 'ansible_connection' from source: unknown 46400 1727204561.65835: variable 'ansible_module_compression' from source: unknown 46400 1727204561.65842: variable 'ansible_shell_type' from source: unknown 46400 1727204561.65847: variable 'ansible_shell_executable' from source: unknown 46400 1727204561.65852: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204561.65857: variable 'ansible_pipelining' from source: unknown 46400 1727204561.65862: variable 'ansible_timeout' from source: unknown 46400 1727204561.65870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204561.66102: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204561.66116: variable 'omit' from source: magic vars 46400 
1727204561.66124: starting attempt loop 46400 1727204561.66130: running the handler 46400 1727204561.66168: _low_level_execute_command(): starting 46400 1727204561.66181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204561.67133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204561.67148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.67172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.67192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.67237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.67273: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204561.67290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.67316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204561.67330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204561.67342: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204561.67354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.67390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.67418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.67433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.67461: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204561.67491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.67613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204561.67638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.67666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.67757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204561.69413: stdout chunk (state=3): >>>/root <<< 46400 1727204561.69584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204561.69623: stderr chunk (state=3): >>><<< 46400 1727204561.69626: stdout chunk (state=3): >>><<< 46400 1727204561.69748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204561.69751: _low_level_execute_command(): starting 46400 1727204561.69754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027 `" && echo ansible-tmp-1727204561.6964827-49919-29132228754027="` echo /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027 `" ) && sleep 0' 46400 1727204561.71075: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.71079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.71234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.71238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.71242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.71642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.71655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.71755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204561.73592: stdout chunk (state=3): >>>ansible-tmp-1727204561.6964827-49919-29132228754027=/root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027 <<< 46400 1727204561.73784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204561.73860: stderr chunk (state=3): >>><<< 46400 1727204561.73863: stdout chunk (state=3): >>><<< 46400 1727204561.74173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204561.6964827-49919-29132228754027=/root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204561.74176: variable 'ansible_module_compression' from source: unknown 46400 1727204561.74179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204561.74181: variable 'ansible_facts' from source: unknown 46400 1727204561.74182: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/AnsiballZ_ping.py 46400 1727204561.75043: Sending initial data 46400 1727204561.75046: Sent initial data (152 bytes) 46400 1727204561.77653: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204561.77681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.77698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.77718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.77761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.77907: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204561.77923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.77942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204561.77954: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204561.77969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204561.78095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.78114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.78135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.78148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.78159: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204561.78176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.78257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204561.78283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.78334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.78405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204561.80122: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204561.80147: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204561.80192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp_0_qyacn /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/AnsiballZ_ping.py <<< 46400 1727204561.80228: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204561.81409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204561.81590: stderr chunk (state=3): >>><<< 46400 1727204561.81594: stdout chunk (state=3): >>><<< 46400 1727204561.81596: done transferring module to remote 46400 1727204561.81598: _low_level_execute_command(): starting 46400 1727204561.81601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/ /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/AnsiballZ_ping.py && sleep 0' 46400 1727204561.83205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204561.83353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.83372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.83392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.83441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.83458: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204561.83478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.83496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204561.83509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204561.83521: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204561.83533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.83546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.83571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.83679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.83692: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204561.83707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.83790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204561.83814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.83832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.83909: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 46400 1727204561.85719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204561.85723: stdout chunk (state=3): >>><<< 46400 1727204561.85725: stderr chunk (state=3): >>><<< 46400 1727204561.85823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204561.85828: _low_level_execute_command(): starting 46400 1727204561.85830: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/AnsiballZ_ping.py && sleep 0' 46400 1727204561.87331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204561.87486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.87504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.87524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.87573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.87592: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204561.87610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.87629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204561.87641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204561.87653: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204561.87671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204561.87693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204561.87710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204561.87724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204561.87735: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204561.87750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204561.87920: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204561.87945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204561.87961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204561.88057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204562.01399: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204562.02077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204562.02138: stdout chunk (state=3): >>><<< 46400 1727204562.02169: stderr chunk (state=3): >>><<< 46400 1727204562.02310: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
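The exchange above is the generic AnsiballZ round trip that every module call in this run goes through: "echo ~" resolves the remote home directory, "umask 77 && mkdir -p ..." creates the per-task ansible-tmp directory under /root/.ansible/tmp, an sftp put transfers AnsiballZ_ping.py, "chmod u+x" marks it executable, /usr/bin/python3.9 runs it over the multiplexed SSH connection, and the "rm -f -r" a few entries further down removes the temporary directory again. The task driving it is the role's "Re-test connectivity" step, which is simply the ping module. For reference, a minimal stand-alone play that exercises the same path could look like the sketch below; the play and task names are illustrative and not taken from the test suite.

    ---
    # Illustrative sketch only: reproduces the ping round trip seen in the log
    # (connection reuse, remote tmp dir, AnsiballZ transfer, module run, cleanup).
    - name: Re-test connectivity (stand-alone sketch)
      hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Ping the managed node over the existing SSH connection
          ansible.builtin.ping:

Run with -vvvv against the same inventory, this would emit essentially the same mkdir / sftp / chmod / python3.9 / rm sequence recorded here.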
46400 1727204562.02315: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204562.02318: _low_level_execute_command(): starting 46400 1727204562.02327: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204561.6964827-49919-29132228754027/ > /dev/null 2>&1 && sleep 0' 46400 1727204562.03875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204562.03892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204562.03907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204562.03926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204562.03974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204562.04097: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204562.04112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204562.04135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204562.04147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204562.04156: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204562.04171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204562.04185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204562.04204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204562.04217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204562.04232: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204562.04245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204562.04422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204562.04440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204562.04454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204562.04533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204562.06400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204562.06406: stdout chunk (state=3): >>><<< 46400 1727204562.06410: stderr chunk (state=3): >>><<< 46400 1727204562.06573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204562.06576: handler run complete 46400 1727204562.06579: attempt loop complete, returning result 46400 1727204562.06581: _execute() done 46400 1727204562.06582: dumping result to json 46400 1727204562.06584: done dumping result, returning 46400 1727204562.06586: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-00000000110a] 46400 1727204562.06588: sending task result for task 0affcd87-79f5-1303-fda8-00000000110a ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204562.06728: no more pending results, returning what we have 46400 1727204562.06733: results queue empty 46400 1727204562.06734: checking for any_errors_fatal 46400 1727204562.06740: done checking for any_errors_fatal 46400 1727204562.06741: checking for max_fail_percentage 46400 1727204562.06743: done checking for max_fail_percentage 46400 1727204562.06744: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.06745: done checking to see if all hosts have failed 46400 1727204562.06745: getting the remaining hosts for this loop 46400 1727204562.06747: done getting the remaining hosts for this loop 46400 1727204562.06750: getting the next task for host managed-node2 46400 1727204562.06763: done getting next task for host managed-node2 46400 1727204562.06766: ^ task is: TASK: meta (role_complete) 46400 1727204562.06771: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204562.06785: getting variables 46400 1727204562.06787: in VariableManager get_vars() 46400 1727204562.06829: Calling all_inventory to load vars for managed-node2 46400 1727204562.06831: Calling groups_inventory to load vars for managed-node2 46400 1727204562.06834: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.06845: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.06847: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.06850: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.07524: done sending task result for task 0affcd87-79f5-1303-fda8-00000000110a 46400 1727204562.07528: WORKER PROCESS EXITING 46400 1727204562.10089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.12553: done with get_vars() 46400 1727204562.12593: done getting variables 46400 1727204562.12696: done queuing things up, now waiting for results queue to drain 46400 1727204562.12698: results queue empty 46400 1727204562.12699: checking for any_errors_fatal 46400 1727204562.12703: done checking for any_errors_fatal 46400 1727204562.12704: checking for max_fail_percentage 46400 1727204562.12705: done checking for max_fail_percentage 46400 1727204562.12706: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.12707: done checking to see if all hosts have failed 46400 1727204562.12707: getting the remaining hosts for this loop 46400 1727204562.12708: done getting the remaining hosts for this loop 46400 1727204562.12711: getting the next task for host managed-node2 46400 1727204562.12721: done getting next task for host managed-node2 46400 1727204562.12724: ^ task is: TASK: Show result 46400 1727204562.12726: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204562.12729: getting variables 46400 1727204562.12730: in VariableManager get_vars() 46400 1727204562.12742: Calling all_inventory to load vars for managed-node2 46400 1727204562.12745: Calling groups_inventory to load vars for managed-node2 46400 1727204562.12747: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.12756: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.12758: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.12767: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.19479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.20385: done with get_vars() 46400 1727204562.20406: done getting variables 46400 1727204562.20441: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.571) 0:00:52.489 ***** 46400 1727204562.20466: entering _queue_task() for managed-node2/debug 46400 1727204562.20755: worker is 1 (out of 1 available) 46400 1727204562.20772: exiting _queue_task() for managed-node2/debug 46400 1727204562.20785: done queuing things up, now waiting for results queue to drain 46400 1727204562.20787: waiting for pending results... 46400 1727204562.21049: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204562.21434: in run() - task 0affcd87-79f5-1303-fda8-000000001090 46400 1727204562.21438: variable 'ansible_search_path' from source: unknown 46400 1727204562.21442: variable 'ansible_search_path' from source: unknown 46400 1727204562.21446: calling self._execute() 46400 1727204562.21450: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.21453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.21457: variable 'omit' from source: magic vars 46400 1727204562.22057: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.22065: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204562.22068: variable 'omit' from source: magic vars 46400 1727204562.22071: variable 'omit' from source: magic vars 46400 1727204562.22074: variable 'omit' from source: magic vars 46400 1727204562.22076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204562.22079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204562.22082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204562.22084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204562.22088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204562.22090: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204562.22092: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.22095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.22157: Set connection var ansible_shell_type to sh 46400 1727204562.22172: Set connection var ansible_shell_executable to /bin/sh 46400 1727204562.22175: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204562.22186: Set connection var ansible_connection to ssh 46400 1727204562.22189: Set connection var ansible_pipelining to False 46400 1727204562.22191: Set connection var ansible_timeout to 10 46400 1727204562.22219: variable 'ansible_shell_executable' from source: unknown 46400 1727204562.22223: variable 'ansible_connection' from source: unknown 46400 1727204562.22226: variable 'ansible_module_compression' from source: unknown 46400 1727204562.22229: variable 'ansible_shell_type' from source: unknown 46400 1727204562.22238: variable 'ansible_shell_executable' from source: unknown 46400 1727204562.22240: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.22243: variable 'ansible_pipelining' from source: unknown 46400 1727204562.22245: variable 'ansible_timeout' from source: unknown 46400 1727204562.22250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.22643: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204562.22647: variable 'omit' from source: magic vars 46400 1727204562.22649: starting attempt loop 46400 1727204562.22651: running the handler 46400 1727204562.22653: variable '__network_connections_result' from source: set_fact 46400 1727204562.22655: variable '__network_connections_result' from source: set_fact 46400 1727204562.22657: handler run complete 46400 1727204562.22676: attempt loop complete, returning result 46400 1727204562.22680: _execute() done 46400 1727204562.22683: dumping result to json 46400 1727204562.22687: done dumping result, returning 46400 1727204562.22695: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-000000001090] 46400 1727204562.22701: sending task result for task 0affcd87-79f5-1303-fda8-000000001090 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024" ] } } 46400 1727204562.22886: no more pending results, returning what we have 46400 1727204562.22892: results queue empty 46400 1727204562.22893: checking for any_errors_fatal 46400 1727204562.22896: done checking for any_errors_fatal 46400 1727204562.22897: checking for max_fail_percentage 46400 1727204562.22899: done checking for max_fail_percentage 46400 1727204562.22900: 
checking to see if all hosts have failed and the running result is not ok 46400 1727204562.22901: done checking to see if all hosts have failed 46400 1727204562.22901: getting the remaining hosts for this loop 46400 1727204562.22903: done getting the remaining hosts for this loop 46400 1727204562.22908: getting the next task for host managed-node2 46400 1727204562.22920: done getting next task for host managed-node2 46400 1727204562.22924: ^ task is: TASK: Include network role 46400 1727204562.22929: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204562.22934: getting variables 46400 1727204562.22936: in VariableManager get_vars() 46400 1727204562.22975: Calling all_inventory to load vars for managed-node2 46400 1727204562.22979: Calling groups_inventory to load vars for managed-node2 46400 1727204562.22983: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.22998: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.23001: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.23005: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.23671: done sending task result for task 0affcd87-79f5-1303-fda8-000000001090 46400 1727204562.23675: WORKER PROCESS EXITING 46400 1727204562.24338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.25281: done with get_vars() 46400 1727204562.25299: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.049) 0:00:52.538 ***** 46400 1727204562.25377: entering _queue_task() for managed-node2/include_role 46400 1727204562.25616: worker is 1 (out of 1 available) 46400 1727204562.25631: exiting _queue_task() for managed-node2/include_role 46400 1727204562.25646: done queuing things up, now waiting for results queue to drain 46400 1727204562.25647: waiting for pending results... 
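The two tasks being scheduled in this stretch of the log come from the test helper files named in the task paths: "Show result" (create_bridge_profile.yml:14) just prints the __network_connections_result value that was stored earlier via set_fact, and "Include network role" (activate_profile.yml:3) re-enters fedora.linux_system_roles.network. A hedged reconstruction of what those two entries look like is sketched below; the authoritative contents live in the collection's test playbooks, and any variables handed to the role are not visible at this point in the log, so treat the snippet as illustrative only.

    # Illustrative reconstruction of the two tasks executed in this part of the log.
    - name: Show result
      ansible.builtin.debug:
        var: __network_connections_result

    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network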
46400 1727204562.25841: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204562.25956: in run() - task 0affcd87-79f5-1303-fda8-000000001094 46400 1727204562.25972: variable 'ansible_search_path' from source: unknown 46400 1727204562.25976: variable 'ansible_search_path' from source: unknown 46400 1727204562.26005: calling self._execute() 46400 1727204562.26080: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.26084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.26093: variable 'omit' from source: magic vars 46400 1727204562.26425: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.26444: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204562.26455: _execute() done 46400 1727204562.26462: dumping result to json 46400 1727204562.26481: done dumping result, returning 46400 1727204562.26492: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000001094] 46400 1727204562.26503: sending task result for task 0affcd87-79f5-1303-fda8-000000001094 46400 1727204562.26671: no more pending results, returning what we have 46400 1727204562.26678: in VariableManager get_vars() 46400 1727204562.26723: Calling all_inventory to load vars for managed-node2 46400 1727204562.26726: Calling groups_inventory to load vars for managed-node2 46400 1727204562.26731: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.26755: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.26760: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.26765: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.27381: done sending task result for task 0affcd87-79f5-1303-fda8-000000001094 46400 1727204562.27384: WORKER PROCESS EXITING 46400 1727204562.28300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.29232: done with get_vars() 46400 1727204562.29249: variable 'ansible_search_path' from source: unknown 46400 1727204562.29250: variable 'ansible_search_path' from source: unknown 46400 1727204562.29355: variable 'omit' from source: magic vars 46400 1727204562.29392: variable 'omit' from source: magic vars 46400 1727204562.29402: variable 'omit' from source: magic vars 46400 1727204562.29404: we have included files to process 46400 1727204562.29405: generating all_blocks data 46400 1727204562.29406: done generating all_blocks data 46400 1727204562.29410: processing included file: fedora.linux_system_roles.network 46400 1727204562.29424: in VariableManager get_vars() 46400 1727204562.29434: done with get_vars() 46400 1727204562.29454: in VariableManager get_vars() 46400 1727204562.29467: done with get_vars() 46400 1727204562.29497: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204562.29575: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204562.29628: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204562.29907: in VariableManager get_vars() 46400 1727204562.29925: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204562.31800: iterating over new_blocks loaded from 
include file 46400 1727204562.31803: in VariableManager get_vars() 46400 1727204562.31831: done with get_vars() 46400 1727204562.31833: filtering new block on tags 46400 1727204562.32211: done filtering new block on tags 46400 1727204562.32220: in VariableManager get_vars() 46400 1727204562.32238: done with get_vars() 46400 1727204562.32240: filtering new block on tags 46400 1727204562.32267: done filtering new block on tags 46400 1727204562.32272: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204562.32279: extending task lists for all hosts with included blocks 46400 1727204562.32437: done extending task lists 46400 1727204562.32439: done processing included files 46400 1727204562.32440: results queue empty 46400 1727204562.32443: checking for any_errors_fatal 46400 1727204562.32451: done checking for any_errors_fatal 46400 1727204562.32453: checking for max_fail_percentage 46400 1727204562.32457: done checking for max_fail_percentage 46400 1727204562.32458: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.32459: done checking to see if all hosts have failed 46400 1727204562.32459: getting the remaining hosts for this loop 46400 1727204562.32461: done getting the remaining hosts for this loop 46400 1727204562.32465: getting the next task for host managed-node2 46400 1727204562.32470: done getting next task for host managed-node2 46400 1727204562.32473: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204562.32478: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204562.32493: getting variables 46400 1727204562.32494: in VariableManager get_vars() 46400 1727204562.32509: Calling all_inventory to load vars for managed-node2 46400 1727204562.32512: Calling groups_inventory to load vars for managed-node2 46400 1727204562.32514: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.32519: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.32521: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.32524: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.34122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.35747: done with get_vars() 46400 1727204562.35772: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.104) 0:00:52.642 ***** 46400 1727204562.35835: entering _queue_task() for managed-node2/include_tasks 46400 1727204562.36093: worker is 1 (out of 1 available) 46400 1727204562.36108: exiting _queue_task() for managed-node2/include_tasks 46400 1727204562.36121: done queuing things up, now waiting for results queue to drain 46400 1727204562.36123: waiting for pending results... 46400 1727204562.36363: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204562.36523: in run() - task 0affcd87-79f5-1303-fda8-00000000127a 46400 1727204562.36542: variable 'ansible_search_path' from source: unknown 46400 1727204562.36548: variable 'ansible_search_path' from source: unknown 46400 1727204562.36598: calling self._execute() 46400 1727204562.36722: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.36734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.36749: variable 'omit' from source: magic vars 46400 1727204562.37197: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.37214: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204562.37226: _execute() done 46400 1727204562.37235: dumping result to json 46400 1727204562.37250: done dumping result, returning 46400 1727204562.37262: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-00000000127a] 46400 1727204562.37277: sending task result for task 0affcd87-79f5-1303-fda8-00000000127a 46400 1727204562.37444: no more pending results, returning what we have 46400 1727204562.37451: in VariableManager get_vars() 46400 1727204562.37508: Calling all_inventory to load vars for managed-node2 46400 1727204562.37512: Calling groups_inventory to load vars for managed-node2 46400 1727204562.37515: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.37530: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.37535: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.37539: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.38246: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127a 46400 1727204562.38251: WORKER PROCESS EXITING 46400 1727204562.39529: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.43039: done with get_vars() 46400 1727204562.43070: variable 'ansible_search_path' from source: unknown 46400 1727204562.43072: variable 'ansible_search_path' from source: unknown 46400 1727204562.43189: we have included files to process 46400 1727204562.43191: generating all_blocks data 46400 1727204562.43193: done generating all_blocks data 46400 1727204562.43197: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204562.43198: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204562.43200: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204562.45356: done processing included file 46400 1727204562.45359: iterating over new_blocks loaded from include file 46400 1727204562.45360: in VariableManager get_vars() 46400 1727204562.45600: done with get_vars() 46400 1727204562.45603: filtering new block on tags 46400 1727204562.45695: done filtering new block on tags 46400 1727204562.45699: in VariableManager get_vars() 46400 1727204562.45918: done with get_vars() 46400 1727204562.45988: filtering new block on tags 46400 1727204562.46207: done filtering new block on tags 46400 1727204562.46211: in VariableManager get_vars() 46400 1727204562.46256: done with get_vars() 46400 1727204562.46258: filtering new block on tags 46400 1727204562.46476: done filtering new block on tags 46400 1727204562.46487: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204562.46493: extending task lists for all hosts with included blocks 46400 1727204562.52069: done extending task lists 46400 1727204562.52071: done processing included files 46400 1727204562.52072: results queue empty 46400 1727204562.52073: checking for any_errors_fatal 46400 1727204562.52076: done checking for any_errors_fatal 46400 1727204562.52077: checking for max_fail_percentage 46400 1727204562.52078: done checking for max_fail_percentage 46400 1727204562.52079: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.52080: done checking to see if all hosts have failed 46400 1727204562.52081: getting the remaining hosts for this loop 46400 1727204562.52082: done getting the remaining hosts for this loop 46400 1727204562.52085: getting the next task for host managed-node2 46400 1727204562.52090: done getting next task for host managed-node2 46400 1727204562.52093: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204562.52098: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204562.52110: getting variables 46400 1727204562.52111: in VariableManager get_vars() 46400 1727204562.52129: Calling all_inventory to load vars for managed-node2 46400 1727204562.52131: Calling groups_inventory to load vars for managed-node2 46400 1727204562.52133: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.52139: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.52142: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.52144: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.54840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.57957: done with get_vars() 46400 1727204562.58501: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.227) 0:00:52.870 ***** 46400 1727204562.58597: entering _queue_task() for managed-node2/setup 46400 1727204562.58963: worker is 1 (out of 1 available) 46400 1727204562.59828: exiting _queue_task() for managed-node2/setup 46400 1727204562.59840: done queuing things up, now waiting for results queue to drain 46400 1727204562.59842: waiting for pending results... 
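set_facts.yml opens with a guard that avoids a second full fact-gathering pass: the setup module only runs when something listed in __network_required_facts is missing from ansible_facts, and the entries below show that conditional evaluating to False, so the task is skipped. A rough sketch of the pattern follows, with the when expression copied from the conditional printed in the log; the gather_subset value is an assumption, since the module arguments are not shown for a skipped task.

    # Sketch of the guarded fact gathering performed by set_facts.yml.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumed; not visible in the log for a skipped task
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

The ostree check a little further on uses the same skip-if-already-known idea: its stat task only runs when __network_is_ostree is not yet defined, which is why it too is skipped in this run.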
46400 1727204562.59971: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204562.60336: in run() - task 0affcd87-79f5-1303-fda8-0000000012d1 46400 1727204562.60414: variable 'ansible_search_path' from source: unknown 46400 1727204562.60418: variable 'ansible_search_path' from source: unknown 46400 1727204562.60468: calling self._execute() 46400 1727204562.60718: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.60722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.60776: variable 'omit' from source: magic vars 46400 1727204562.61703: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.62519: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204562.63927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204562.70467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204562.70649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204562.70695: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204562.70730: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204562.70871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204562.70954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204562.71098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204562.71123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204562.71166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204562.71289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204562.71340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204562.71366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204562.71474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204562.71519: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204562.71534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204562.71938: variable '__network_required_facts' from source: role '' defaults 46400 1727204562.71947: variable 'ansible_facts' from source: unknown 46400 1727204562.73530: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204562.73534: when evaluation is False, skipping this task 46400 1727204562.73537: _execute() done 46400 1727204562.73539: dumping result to json 46400 1727204562.73542: done dumping result, returning 46400 1727204562.73545: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-0000000012d1] 46400 1727204562.73552: sending task result for task 0affcd87-79f5-1303-fda8-0000000012d1 46400 1727204562.73868: done sending task result for task 0affcd87-79f5-1303-fda8-0000000012d1 46400 1727204562.73873: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204562.73920: no more pending results, returning what we have 46400 1727204562.73925: results queue empty 46400 1727204562.73927: checking for any_errors_fatal 46400 1727204562.73929: done checking for any_errors_fatal 46400 1727204562.73929: checking for max_fail_percentage 46400 1727204562.73931: done checking for max_fail_percentage 46400 1727204562.73933: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.73933: done checking to see if all hosts have failed 46400 1727204562.73934: getting the remaining hosts for this loop 46400 1727204562.73936: done getting the remaining hosts for this loop 46400 1727204562.73941: getting the next task for host managed-node2 46400 1727204562.73954: done getting next task for host managed-node2 46400 1727204562.73962: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204562.73972: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204562.73999: getting variables 46400 1727204562.74001: in VariableManager get_vars() 46400 1727204562.74042: Calling all_inventory to load vars for managed-node2 46400 1727204562.74045: Calling groups_inventory to load vars for managed-node2 46400 1727204562.74047: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.74058: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.74062: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.74073: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.78431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.83330: done with get_vars() 46400 1727204562.83435: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.250) 0:00:53.121 ***** 46400 1727204562.83672: entering _queue_task() for managed-node2/stat 46400 1727204562.84483: worker is 1 (out of 1 available) 46400 1727204562.84502: exiting _queue_task() for managed-node2/stat 46400 1727204562.84516: done queuing things up, now waiting for results queue to drain 46400 1727204562.84518: waiting for pending results... 46400 1727204562.85199: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204562.85572: in run() - task 0affcd87-79f5-1303-fda8-0000000012d3 46400 1727204562.85587: variable 'ansible_search_path' from source: unknown 46400 1727204562.85591: variable 'ansible_search_path' from source: unknown 46400 1727204562.85627: calling self._execute() 46400 1727204562.85838: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.85842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.85852: variable 'omit' from source: magic vars 46400 1727204562.86706: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.86718: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204562.87121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204562.87742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204562.87788: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204562.87937: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204562.87969: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204562.88274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204562.88277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204562.88280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204562.88394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204562.89426: variable '__network_is_ostree' from source: set_fact 46400 1727204562.89586: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204562.89597: when evaluation is False, skipping this task 46400 1727204562.89604: _execute() done 46400 1727204562.89611: dumping result to json 46400 1727204562.89618: done dumping result, returning 46400 1727204562.89630: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-0000000012d3] 46400 1727204562.89643: sending task result for task 0affcd87-79f5-1303-fda8-0000000012d3 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204562.89836: no more pending results, returning what we have 46400 1727204562.89841: results queue empty 46400 1727204562.89842: checking for any_errors_fatal 46400 1727204562.89853: done checking for any_errors_fatal 46400 1727204562.89853: checking for max_fail_percentage 46400 1727204562.89855: done checking for max_fail_percentage 46400 1727204562.89857: checking to see if all hosts have failed and the running result is not ok 46400 1727204562.89857: done checking to see if all hosts have failed 46400 1727204562.89858: getting the remaining hosts for this loop 46400 1727204562.89862: done getting the remaining hosts for this loop 46400 1727204562.89870: getting the next task for host managed-node2 46400 1727204562.89882: done getting next task for host managed-node2 46400 1727204562.89887: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204562.89894: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204562.89921: getting variables 46400 1727204562.89923: in VariableManager get_vars() 46400 1727204562.90059: Calling all_inventory to load vars for managed-node2 46400 1727204562.90066: Calling groups_inventory to load vars for managed-node2 46400 1727204562.90069: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204562.90081: Calling all_plugins_play to load vars for managed-node2 46400 1727204562.90083: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204562.90086: Calling groups_plugins_play to load vars for managed-node2 46400 1727204562.91255: done sending task result for task 0affcd87-79f5-1303-fda8-0000000012d3 46400 1727204562.91262: WORKER PROCESS EXITING 46400 1727204562.93158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204562.97779: done with get_vars() 46400 1727204562.97824: done getting variables 46400 1727204562.97901: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:42 -0400 (0:00:00.142) 0:00:53.264 ***** 46400 1727204562.97948: entering _queue_task() for managed-node2/set_fact 46400 1727204562.98580: worker is 1 (out of 1 available) 46400 1727204562.98597: exiting _queue_task() for managed-node2/set_fact 46400 1727204562.98610: done queuing things up, now waiting for results queue to drain 46400 1727204562.98612: waiting for pending results... 
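The skip recorded above for 'Ensure ansible_facts used by role are present' follows from the conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) evaluating to False: every fact the role needs is already present in the cached facts, so there is nothing left to gather. A minimal Python sketch of that set-difference guard, using purely illustrative fact names rather than values from this run:

    # Equivalent of: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    required = ["distribution", "distribution_major_version", "os_family"]   # hypothetical __network_required_facts
    gathered = {"distribution": "CentOS", "distribution_major_version": "9",
                "os_family": "RedHat", "hostname": "managed-node2"}          # stand-in for ansible_facts

    missing = [fact for fact in required if fact not in gathered]
    run_gather = len(missing) > 0     # True would re-gather facts; False skips the task, as in the log above
    print(missing, run_gather)        # [] False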
46400 1727204562.98926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204562.99373: in run() - task 0affcd87-79f5-1303-fda8-0000000012d4 46400 1727204562.99388: variable 'ansible_search_path' from source: unknown 46400 1727204562.99392: variable 'ansible_search_path' from source: unknown 46400 1727204562.99428: calling self._execute() 46400 1727204562.99530: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204562.99536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204562.99544: variable 'omit' from source: magic vars 46400 1727204562.99943: variable 'ansible_distribution_major_version' from source: facts 46400 1727204562.99955: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204563.00150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204563.00489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204563.00535: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204563.00594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204563.00626: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204563.00731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204563.00786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204563.00833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204563.00858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204563.00959: variable '__network_is_ostree' from source: set_fact 46400 1727204563.00966: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204563.00969: when evaluation is False, skipping this task 46400 1727204563.00972: _execute() done 46400 1727204563.00974: dumping result to json 46400 1727204563.00977: done dumping result, returning 46400 1727204563.00987: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-0000000012d4] 46400 1727204563.00994: sending task result for task 0affcd87-79f5-1303-fda8-0000000012d4 46400 1727204563.01101: done sending task result for task 0affcd87-79f5-1303-fda8-0000000012d4 46400 1727204563.01106: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204563.01159: no more pending results, returning what we have 46400 1727204563.01166: results queue empty 46400 1727204563.01168: checking for any_errors_fatal 46400 1727204563.01177: done checking for any_errors_fatal 46400 
1727204563.01178: checking for max_fail_percentage 46400 1727204563.01180: done checking for max_fail_percentage 46400 1727204563.01182: checking to see if all hosts have failed and the running result is not ok 46400 1727204563.01183: done checking to see if all hosts have failed 46400 1727204563.01183: getting the remaining hosts for this loop 46400 1727204563.01186: done getting the remaining hosts for this loop 46400 1727204563.01190: getting the next task for host managed-node2 46400 1727204563.01204: done getting next task for host managed-node2 46400 1727204563.01208: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204563.01215: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204563.01244: getting variables 46400 1727204563.01247: in VariableManager get_vars() 46400 1727204563.01295: Calling all_inventory to load vars for managed-node2 46400 1727204563.01299: Calling groups_inventory to load vars for managed-node2 46400 1727204563.01302: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204563.01313: Calling all_plugins_play to load vars for managed-node2 46400 1727204563.01316: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204563.01319: Calling groups_plugins_play to load vars for managed-node2 46400 1727204563.04510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204563.08068: done with get_vars() 46400 1727204563.08107: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:43 -0400 (0:00:00.103) 0:00:53.367 ***** 46400 1727204563.08330: entering _queue_task() for managed-node2/service_facts 46400 1727204563.09136: worker is 1 (out of 1 available) 46400 1727204563.09151: exiting _queue_task() for managed-node2/service_facts 46400 1727204563.09166: done queuing things up, now waiting for results queue to drain 46400 1727204563.09168: waiting for pending results... 
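Both ostree tasks above ('Check if system is ostree' at set_facts.yml:12 and 'Set flag to indicate system is ostree' at set_facts.yml:17) are skipped for the same reason: __network_is_ostree was already established by set_fact earlier in the run, so the guard not __network_is_ostree is defined evaluates to False. A rough Python sketch of that compute-once pattern; the facts dict and the /run/ostree-booted marker path are assumptions for illustration, not details taken from this log:

    import os

    facts = {"__network_is_ostree": False}      # assumed: fact already set earlier in the run

    if "__network_is_ostree" not in facts:      # when: not __network_is_ostree is defined
        # First pass only: stat the marker file (commonly /run/ostree-booted) and record the result.
        facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")

    print(facts["__network_is_ostree"])         # False -> later passes skip both tasks, as seen here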
46400 1727204563.09992: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204563.10417: in run() - task 0affcd87-79f5-1303-fda8-0000000012d6 46400 1727204563.10448: variable 'ansible_search_path' from source: unknown 46400 1727204563.10510: variable 'ansible_search_path' from source: unknown 46400 1727204563.10569: calling self._execute() 46400 1727204563.10762: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204563.10896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204563.10932: variable 'omit' from source: magic vars 46400 1727204563.11724: variable 'ansible_distribution_major_version' from source: facts 46400 1727204563.11848: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204563.11984: variable 'omit' from source: magic vars 46400 1727204563.12072: variable 'omit' from source: magic vars 46400 1727204563.12340: variable 'omit' from source: magic vars 46400 1727204563.12521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204563.12620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204563.12719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204563.12763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204563.12868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204563.12912: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204563.12921: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204563.12969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204563.13213: Set connection var ansible_shell_type to sh 46400 1727204563.13255: Set connection var ansible_shell_executable to /bin/sh 46400 1727204563.13294: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204563.13305: Set connection var ansible_connection to ssh 46400 1727204563.13338: Set connection var ansible_pipelining to False 46400 1727204563.13348: Set connection var ansible_timeout to 10 46400 1727204563.13428: variable 'ansible_shell_executable' from source: unknown 46400 1727204563.13442: variable 'ansible_connection' from source: unknown 46400 1727204563.13450: variable 'ansible_module_compression' from source: unknown 46400 1727204563.13514: variable 'ansible_shell_type' from source: unknown 46400 1727204563.13521: variable 'ansible_shell_executable' from source: unknown 46400 1727204563.13527: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204563.13534: variable 'ansible_pipelining' from source: unknown 46400 1727204563.13542: variable 'ansible_timeout' from source: unknown 46400 1727204563.13553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204563.13798: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204563.13813: variable 'omit' from source: magic vars 46400 
1727204563.13821: starting attempt loop 46400 1727204563.13827: running the handler 46400 1727204563.13849: _low_level_execute_command(): starting 46400 1727204563.13865: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204563.15929: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204563.15949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.15979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.16000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.16045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.16063: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204563.16085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.16104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204563.16115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204563.16126: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204563.16139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.16153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.16178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.16198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.16212: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204563.16228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.16341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204563.16369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204563.16387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204563.16471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204563.18138: stdout chunk (state=3): >>>/root <<< 46400 1727204563.18328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204563.18332: stdout chunk (state=3): >>><<< 46400 1727204563.18334: stderr chunk (state=3): >>><<< 46400 1727204563.18452: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204563.18455: _low_level_execute_command(): starting 46400 1727204563.18458: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130 `" && echo ansible-tmp-1727204563.1835592-49985-78703364499130="` echo /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130 `" ) && sleep 0' 46400 1727204563.19737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204563.20186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.20205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.20225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.20267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.20281: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204563.20297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.20315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204563.20328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204563.20340: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204563.20353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.20370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.20386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.20399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.20411: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204563.20426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.20501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204563.20525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204563.20542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204563.20618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204563.22527: stdout chunk (state=3): >>>ansible-tmp-1727204563.1835592-49985-78703364499130=/root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130 <<< 46400 1727204563.22684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204563.22755: stderr chunk (state=3): >>><<< 46400 1727204563.22759: stdout chunk (state=3): >>><<< 46400 1727204563.22872: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204563.1835592-49985-78703364499130=/root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204563.22875: variable 'ansible_module_compression' from source: unknown 46400 1727204563.22958: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204563.22961: variable 'ansible_facts' from source: unknown 46400 1727204563.23039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/AnsiballZ_service_facts.py 46400 1727204563.24428: Sending initial data 46400 1727204563.24432: Sent initial data (161 bytes) 46400 1727204563.26586: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204563.26786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.26803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.26822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.26867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.26987: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204563.27002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.27019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204563.27030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204563.27040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204563.27052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.27069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.27087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.27099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.27109: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204563.27122: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.27400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204563.27424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204563.27441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204563.27511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204563.29342: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204563.29403: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204563.29437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpuqbqvkjj /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/AnsiballZ_service_facts.py <<< 46400 1727204563.29749: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204563.30949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204563.31086: stderr chunk (state=3): >>><<< 46400 1727204563.31090: stdout chunk (state=3): >>><<< 46400 1727204563.31092: done transferring module to remote 46400 1727204563.31099: _low_level_execute_command(): starting 46400 1727204563.31101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/ /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/AnsiballZ_service_facts.py && sleep 0' 46400 1727204563.32445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204563.33087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.33106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.33129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.33182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.33196: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204563.33211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.33230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204563.33244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204563.33256: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204563.33282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.33296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.33312: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.33323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.33333: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204563.33345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.33427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204563.33454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204563.33477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204563.33550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204563.35411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204563.35415: stdout chunk (state=3): >>><<< 46400 1727204563.35417: stderr chunk (state=3): >>><<< 46400 1727204563.35470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204563.35474: _low_level_execute_command(): starting 46400 1727204563.35476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/AnsiballZ_service_facts.py && sleep 0' 46400 1727204563.37216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204563.37235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.37253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.37278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.37322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.37336: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204563.37351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.37375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204563.37388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204563.37402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204563.37414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204563.37430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204563.37447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204563.37462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204563.37477: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204563.37491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204563.37571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204563.38291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204563.38310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204563.38394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.67074: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped"<<< 46400 1727204564.67092: stdout chunk (state=3): >>>, "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "st<<< 46400 1727204564.67101: stdout chunk (state=3): >>>atus": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autof<<< 46400 1727204564.67136: stdout chunk (state=3): >>>s.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204564.68367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204564.68426: stderr chunk (state=3): >>><<< 46400 1727204564.68429: stdout chunk (state=3): >>><<< 46400 1727204564.68457: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204564.69036: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204564.69045: _low_level_execute_command(): starting 46400 1727204564.69050: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204563.1835592-49985-78703364499130/ > /dev/null 2>&1 && sleep 0' 46400 1727204564.69660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.69679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.69683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.69696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.69735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.69742: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.69750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.69775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.69779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204564.69784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.69792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.69813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.69816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.69819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.69824: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.69844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.69909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.69926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.69932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.70004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.71753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204564.71809: stderr chunk (state=3): >>><<< 46400 1727204564.71812: stdout chunk (state=3): >>><<< 46400 1727204564.71828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204564.71867: handler run complete 46400 1727204564.72173: variable 'ansible_facts' from source: unknown 46400 1727204564.72176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204564.72612: variable 'ansible_facts' from source: unknown 46400 1727204564.72732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204564.72919: attempt loop complete, returning result 46400 1727204564.72930: _execute() done 46400 1727204564.72933: dumping result to json 46400 1727204564.72993: done dumping result, returning 46400 1727204564.73002: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-0000000012d6] 46400 1727204564.73007: sending task result for task 0affcd87-79f5-1303-fda8-0000000012d6 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204564.73762: done sending task result for task 0affcd87-79f5-1303-fda8-0000000012d6 46400 1727204564.73782: no more pending results, returning what we have 46400 1727204564.73786: results queue empty 46400 1727204564.73787: checking for any_errors_fatal 46400 1727204564.73793: done checking for any_errors_fatal 46400 1727204564.73793: checking for max_fail_percentage 46400 1727204564.73795: done checking for max_fail_percentage 46400 1727204564.73796: checking to see if all hosts have failed and the running result is not ok 46400 1727204564.73797: done checking to see if all hosts have failed 46400 1727204564.73798: getting the remaining hosts for this loop 46400 1727204564.73799: done getting the remaining hosts for this loop 46400 1727204564.73803: getting the next task for host managed-node2 46400 1727204564.73811: done getting next task for host managed-node2 46400 1727204564.73814: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204564.73820: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204564.73826: WORKER PROCESS EXITING 46400 1727204564.73846: getting variables 46400 1727204564.73851: in VariableManager get_vars() 46400 1727204564.73890: Calling all_inventory to load vars for managed-node2 46400 1727204564.73893: Calling groups_inventory to load vars for managed-node2 46400 1727204564.73895: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204564.73905: Calling all_plugins_play to load vars for managed-node2 46400 1727204564.73908: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204564.73910: Calling groups_plugins_play to load vars for managed-node2 46400 1727204564.75545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204564.77330: done with get_vars() 46400 1727204564.77357: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:44 -0400 (0:00:01.691) 0:00:55.059 ***** 46400 1727204564.77467: entering _queue_task() for managed-node2/package_facts 46400 1727204564.77825: worker is 1 (out of 1 available) 46400 1727204564.77838: exiting _queue_task() for managed-node2/package_facts 46400 1727204564.77852: done queuing things up, now waiting for results queue to drain 46400 1727204564.77854: waiting for pending results... 
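
The task queued here runs package_facts, whose stdout (shown further down in this log) returns ansible_facts.packages as a mapping of package name to a list of installed instances, each carrying name, version, release, epoch, arch, and source. A minimal sketch of that shape, using one entry copied from that output and illustrative variable names:

    # Sketch only: mirrors the packages mapping emitted by package_facts below.
    packages = {
        "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9",
                    "epoch": None, "arch": "noarch", "source": "rpm"}],
    }

    for name, installs in packages.items():
        for pkg in installs:
            # Build a name-version-release.arch label; epoch is omitted here
            # because this particular entry has none.
            print(f"{name}-{pkg['version']}-{pkg['release']}.{pkg['arch']}")
    # -> tzdata-2024a-2.el9.noarch
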
46400 1727204564.78167: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204564.78327: in run() - task 0affcd87-79f5-1303-fda8-0000000012d7 46400 1727204564.78339: variable 'ansible_search_path' from source: unknown 46400 1727204564.78343: variable 'ansible_search_path' from source: unknown 46400 1727204564.78385: calling self._execute() 46400 1727204564.78484: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204564.78489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204564.78500: variable 'omit' from source: magic vars 46400 1727204564.78895: variable 'ansible_distribution_major_version' from source: facts 46400 1727204564.78911: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204564.78917: variable 'omit' from source: magic vars 46400 1727204564.79009: variable 'omit' from source: magic vars 46400 1727204564.79044: variable 'omit' from source: magic vars 46400 1727204564.79095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204564.79135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204564.79155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204564.79183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204564.79194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204564.79224: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204564.79231: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204564.79234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204564.79335: Set connection var ansible_shell_type to sh 46400 1727204564.79349: Set connection var ansible_shell_executable to /bin/sh 46400 1727204564.79354: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204564.79359: Set connection var ansible_connection to ssh 46400 1727204564.79369: Set connection var ansible_pipelining to False 46400 1727204564.79376: Set connection var ansible_timeout to 10 46400 1727204564.79406: variable 'ansible_shell_executable' from source: unknown 46400 1727204564.79409: variable 'ansible_connection' from source: unknown 46400 1727204564.79412: variable 'ansible_module_compression' from source: unknown 46400 1727204564.79414: variable 'ansible_shell_type' from source: unknown 46400 1727204564.79417: variable 'ansible_shell_executable' from source: unknown 46400 1727204564.79419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204564.79423: variable 'ansible_pipelining' from source: unknown 46400 1727204564.79425: variable 'ansible_timeout' from source: unknown 46400 1727204564.79430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204564.79643: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204564.79652: variable 'omit' from source: magic vars 46400 
1727204564.79655: starting attempt loop 46400 1727204564.79658: running the handler 46400 1727204564.79684: _low_level_execute_command(): starting 46400 1727204564.79691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204564.80489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.80504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.80516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.80532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.80588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.80591: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.80601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.80615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.80624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204564.80631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.80638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.80654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.80671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.80679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.80686: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.80702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.80780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.80805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.80818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.80890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.82470: stdout chunk (state=3): >>>/root <<< 46400 1727204564.82769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204564.82772: stdout chunk (state=3): >>><<< 46400 1727204564.82774: stderr chunk (state=3): >>><<< 46400 1727204564.82779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204564.82781: _low_level_execute_command(): starting 46400 1727204564.82784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635 `" && echo ansible-tmp-1727204564.8267472-50038-153527847585635="` echo /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635 `" ) && sleep 0' 46400 1727204564.83434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.83444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.83454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.83471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.83510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.83516: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.83526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.83539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.83546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204564.83552: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.83563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.83579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.83590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.83598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.83605: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.83614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.83690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.83704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.83713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.83789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.85617: stdout chunk (state=3): >>>ansible-tmp-1727204564.8267472-50038-153527847585635=/root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635 <<< 46400 1727204564.85770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204564.85820: stderr chunk (state=3): >>><<< 46400 1727204564.85823: stdout chunk (state=3): >>><<< 46400 1727204564.85871: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204564.8267472-50038-153527847585635=/root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204564.86072: variable 'ansible_module_compression' from source: unknown 46400 1727204564.86076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204564.86079: variable 'ansible_facts' from source: unknown 46400 1727204564.86202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/AnsiballZ_package_facts.py 46400 1727204564.86371: Sending initial data 46400 1727204564.86374: Sent initial data (162 bytes) 46400 1727204564.87328: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.87342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.87357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.87380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.87423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.87436: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.87451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.87473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.87486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204564.87498: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.87512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.87525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.87541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.87553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.87569: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.87584: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.87659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.87679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.87695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.87793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.89497: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204564.89533: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204564.89577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpuafsm5w_ /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/AnsiballZ_package_facts.py <<< 46400 1727204564.89621: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204564.92195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204564.92278: stderr chunk (state=3): >>><<< 46400 1727204564.92282: stdout chunk (state=3): >>><<< 46400 1727204564.92305: done transferring module to remote 46400 1727204564.92316: _low_level_execute_command(): starting 46400 1727204564.92321: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/ /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/AnsiballZ_package_facts.py && sleep 0' 46400 1727204564.92966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.92978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.92991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.93006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.93046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.93053: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.93066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.93079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.93085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204564.93092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.93099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.93109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204564.93123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.93131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.93138: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.93148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.93220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.93234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.93245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.93318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204564.95019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204564.95108: stderr chunk (state=3): >>><<< 46400 1727204564.95111: stdout chunk (state=3): >>><<< 46400 1727204564.95129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204564.95132: _low_level_execute_command(): starting 46400 1727204564.95139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/AnsiballZ_package_facts.py && sleep 0' 46400 1727204564.95774: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204564.95784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.95794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.95808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.95846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.95853: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204564.95866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.95878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204564.95885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 
is address <<< 46400 1727204564.95892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204564.95900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204564.95911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204564.95919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204564.95927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204564.95933: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204564.95942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204564.96030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204564.96034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204564.96042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204564.96119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204565.42263: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": 
"dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204565.42342: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", 
"version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204565.42412: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 46400 
1727204565.42422: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204565.42426: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 46400 1727204565.42430: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204565.42446: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204565.42474: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 46400 1727204565.42478: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", 
"version": "5.14.0", "r<<< 46400 1727204565.42483: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204565.42485: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204565.42489: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": 
"0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204565.42491: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": 
"perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204565.42494: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204565.42498: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": 
[{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204565.42501: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", 
"version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204565.44043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204565.44072: stderr chunk (state=3): >>><<< 46400 1727204565.44076: stdout chunk (state=3): >>><<< 46400 1727204565.44177: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": 
"libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": 
"19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": 
"8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": 
[{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204565.46562: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204565.46591: _low_level_execute_command(): starting 46400 1727204565.46602: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204564.8267472-50038-153527847585635/ > /dev/null 2>&1 && sleep 0' 46400 1727204565.47221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204565.47249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204565.47252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204565.47291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204565.47305: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204565.47315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204565.47328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204565.47338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204565.47395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204565.47399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204565.47454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204565.49291: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 46400 1727204565.49337: stderr chunk (state=3): >>><<< 46400 1727204565.49340: stdout chunk (state=3): >>><<< 46400 1727204565.49369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204565.49372: handler run complete 46400 1727204565.50294: variable 'ansible_facts' from source: unknown 46400 1727204565.50592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.51804: variable 'ansible_facts' from source: unknown 46400 1727204565.52494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.54308: attempt loop complete, returning result 46400 1727204565.54331: _execute() done 46400 1727204565.54344: dumping result to json 46400 1727204565.54610: done dumping result, returning 46400 1727204565.54618: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-0000000012d7] 46400 1727204565.54624: sending task result for task 0affcd87-79f5-1303-fda8-0000000012d7 46400 1727204565.56560: done sending task result for task 0affcd87-79f5-1303-fda8-0000000012d7 46400 1727204565.56566: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204565.56707: no more pending results, returning what we have 46400 1727204565.56710: results queue empty 46400 1727204565.56710: checking for any_errors_fatal 46400 1727204565.56716: done checking for any_errors_fatal 46400 1727204565.56717: checking for max_fail_percentage 46400 1727204565.56719: done checking for max_fail_percentage 46400 1727204565.56719: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.56721: done checking to see if all hosts have failed 46400 1727204565.56721: getting the remaining hosts for this loop 46400 1727204565.56723: done getting the remaining hosts for this loop 46400 1727204565.56726: getting the next task for host managed-node2 46400 1727204565.56733: done getting next task for host managed-node2 46400 1727204565.56737: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204565.56742: ^ state is: HOST STATE: block=6, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204565.56753: getting variables 46400 1727204565.56754: in VariableManager get_vars() 46400 1727204565.56785: Calling all_inventory to load vars for managed-node2 46400 1727204565.56788: Calling groups_inventory to load vars for managed-node2 46400 1727204565.56790: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.56803: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.56807: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.56811: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.58323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.60876: done with get_vars() 46400 1727204565.60907: done getting variables 46400 1727204565.60985: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:45 -0400 (0:00:00.835) 0:00:55.894 ***** 46400 1727204565.61027: entering _queue_task() for managed-node2/debug 46400 1727204565.61290: worker is 1 (out of 1 available) 46400 1727204565.61305: exiting _queue_task() for managed-node2/debug 46400 1727204565.61317: done queuing things up, now waiting for results queue to drain 46400 1727204565.61319: waiting for pending results... 
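For reference, the package_facts invocation recorded above (module_args: manager=["auto"], strategy="first", result censored because no_log was set) corresponds to a task along the following lines. This is a minimal sketch reconstructed from the logged arguments only, not the actual source of the fedora.linux_system_roles.network role:

# Hypothetical reconstruction; only the module, its arguments, and no_log
# are taken from the log. "strategy: first" is the module default and may
# not be spelled out in the real task.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
  no_log: true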
46400 1727204565.61511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204565.61611: in run() - task 0affcd87-79f5-1303-fda8-00000000127b 46400 1727204565.61623: variable 'ansible_search_path' from source: unknown 46400 1727204565.61627: variable 'ansible_search_path' from source: unknown 46400 1727204565.61662: calling self._execute() 46400 1727204565.61735: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.61739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.61753: variable 'omit' from source: magic vars 46400 1727204565.62135: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.62153: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204565.62170: variable 'omit' from source: magic vars 46400 1727204565.62255: variable 'omit' from source: magic vars 46400 1727204565.62378: variable 'network_provider' from source: set_fact 46400 1727204565.62404: variable 'omit' from source: magic vars 46400 1727204565.62465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204565.62505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204565.62544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204565.62572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204565.62588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204565.62624: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204565.62656: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.62676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.62906: Set connection var ansible_shell_type to sh 46400 1727204565.62920: Set connection var ansible_shell_executable to /bin/sh 46400 1727204565.62923: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204565.62930: Set connection var ansible_connection to ssh 46400 1727204565.62942: Set connection var ansible_pipelining to False 46400 1727204565.62946: Set connection var ansible_timeout to 10 46400 1727204565.62969: variable 'ansible_shell_executable' from source: unknown 46400 1727204565.63004: variable 'ansible_connection' from source: unknown 46400 1727204565.63008: variable 'ansible_module_compression' from source: unknown 46400 1727204565.63011: variable 'ansible_shell_type' from source: unknown 46400 1727204565.63013: variable 'ansible_shell_executable' from source: unknown 46400 1727204565.63015: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.63017: variable 'ansible_pipelining' from source: unknown 46400 1727204565.63019: variable 'ansible_timeout' from source: unknown 46400 1727204565.63021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.63110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204565.63118: variable 'omit' from source: magic vars 46400 1727204565.63123: starting attempt loop 46400 1727204565.63127: running the handler 46400 1727204565.63171: handler run complete 46400 1727204565.63181: attempt loop complete, returning result 46400 1727204565.63183: _execute() done 46400 1727204565.63186: dumping result to json 46400 1727204565.63189: done dumping result, returning 46400 1727204565.63194: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-00000000127b] 46400 1727204565.63199: sending task result for task 0affcd87-79f5-1303-fda8-00000000127b 46400 1727204565.63284: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127b 46400 1727204565.63287: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204565.63349: no more pending results, returning what we have 46400 1727204565.63353: results queue empty 46400 1727204565.63354: checking for any_errors_fatal 46400 1727204565.63373: done checking for any_errors_fatal 46400 1727204565.63374: checking for max_fail_percentage 46400 1727204565.63376: done checking for max_fail_percentage 46400 1727204565.63377: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.63378: done checking to see if all hosts have failed 46400 1727204565.63379: getting the remaining hosts for this loop 46400 1727204565.63381: done getting the remaining hosts for this loop 46400 1727204565.63385: getting the next task for host managed-node2 46400 1727204565.63393: done getting next task for host managed-node2 46400 1727204565.63397: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204565.63402: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204565.63413: getting variables 46400 1727204565.63415: in VariableManager get_vars() 46400 1727204565.63447: Calling all_inventory to load vars for managed-node2 46400 1727204565.63449: Calling groups_inventory to load vars for managed-node2 46400 1727204565.63451: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.63463: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.63467: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.63470: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.64624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.66280: done with get_vars() 46400 1727204565.66311: done getting variables 46400 1727204565.66380: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:45 -0400 (0:00:00.053) 0:00:55.948 ***** 46400 1727204565.66424: entering _queue_task() for managed-node2/fail 46400 1727204565.66730: worker is 1 (out of 1 available) 46400 1727204565.66743: exiting _queue_task() for managed-node2/fail 46400 1727204565.66757: done queuing things up, now waiting for results queue to drain 46400 1727204565.66762: waiting for pending results... 
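The "Print network provider" task at roles/network/tasks/main.yml:7 above loads the debug action plugin and reports "Using network provider: nm", with network_provider coming from an earlier set_fact. A hedged sketch of such a task (the exact wording in main.yml:7 may differ):

# Sketch inferred from the logged action ('debug') and the emitted message;
# network_provider is the fact the log shows being read from set_fact.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"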
46400 1727204565.66954: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204565.67070: in run() - task 0affcd87-79f5-1303-fda8-00000000127c 46400 1727204565.67081: variable 'ansible_search_path' from source: unknown 46400 1727204565.67085: variable 'ansible_search_path' from source: unknown 46400 1727204565.67118: calling self._execute() 46400 1727204565.67200: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.67204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.67212: variable 'omit' from source: magic vars 46400 1727204565.67502: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.67516: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204565.67599: variable 'network_state' from source: role '' defaults 46400 1727204565.67608: Evaluated conditional (network_state != {}): False 46400 1727204565.67611: when evaluation is False, skipping this task 46400 1727204565.67614: _execute() done 46400 1727204565.67618: dumping result to json 46400 1727204565.67621: done dumping result, returning 46400 1727204565.67624: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-00000000127c] 46400 1727204565.67632: sending task result for task 0affcd87-79f5-1303-fda8-00000000127c 46400 1727204565.67720: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127c 46400 1727204565.67723: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204565.67792: no more pending results, returning what we have 46400 1727204565.67796: results queue empty 46400 1727204565.67797: checking for any_errors_fatal 46400 1727204565.67803: done checking for any_errors_fatal 46400 1727204565.67804: checking for max_fail_percentage 46400 1727204565.67805: done checking for max_fail_percentage 46400 1727204565.67806: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.67807: done checking to see if all hosts have failed 46400 1727204565.67808: getting the remaining hosts for this loop 46400 1727204565.67810: done getting the remaining hosts for this loop 46400 1727204565.67813: getting the next task for host managed-node2 46400 1727204565.67822: done getting next task for host managed-node2 46400 1727204565.67825: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204565.67830: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204565.67859: getting variables 46400 1727204565.67861: in VariableManager get_vars() 46400 1727204565.67896: Calling all_inventory to load vars for managed-node2 46400 1727204565.67899: Calling groups_inventory to load vars for managed-node2 46400 1727204565.67901: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.67909: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.67912: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.67914: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.68826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.70368: done with get_vars() 46400 1727204565.70386: done getting variables 46400 1727204565.70426: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:45 -0400 (0:00:00.040) 0:00:55.989 ***** 46400 1727204565.70452: entering _queue_task() for managed-node2/fail 46400 1727204565.70672: worker is 1 (out of 1 available) 46400 1727204565.70688: exiting _queue_task() for managed-node2/fail 46400 1727204565.70700: done queuing things up, now waiting for results queue to drain 46400 1727204565.70702: waiting for pending results... 
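The skipped task above (main.yml:11) uses the fail action and is gated on the condition the log reports as false, network_state != {}. A hedged reconstruction of that guard follows; only the action and the condition are taken from the log, the message wording is assumed, and the real task may carry additional conditions (for example on the provider) that the log does not show:

# Placeholder reconstruction based on the logged action plugin and false_condition.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
  when: network_state != {}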
46400 1727204565.70902: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204565.70994: in run() - task 0affcd87-79f5-1303-fda8-00000000127d 46400 1727204565.71009: variable 'ansible_search_path' from source: unknown 46400 1727204565.71015: variable 'ansible_search_path' from source: unknown 46400 1727204565.71044: calling self._execute() 46400 1727204565.71119: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.71126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.71135: variable 'omit' from source: magic vars 46400 1727204565.71407: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.71416: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204565.71506: variable 'network_state' from source: role '' defaults 46400 1727204565.71513: Evaluated conditional (network_state != {}): False 46400 1727204565.71516: when evaluation is False, skipping this task 46400 1727204565.71519: _execute() done 46400 1727204565.71522: dumping result to json 46400 1727204565.71524: done dumping result, returning 46400 1727204565.71531: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-00000000127d] 46400 1727204565.71541: sending task result for task 0affcd87-79f5-1303-fda8-00000000127d 46400 1727204565.71632: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127d 46400 1727204565.71635: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204565.71692: no more pending results, returning what we have 46400 1727204565.71696: results queue empty 46400 1727204565.71697: checking for any_errors_fatal 46400 1727204565.71702: done checking for any_errors_fatal 46400 1727204565.71703: checking for max_fail_percentage 46400 1727204565.71704: done checking for max_fail_percentage 46400 1727204565.71705: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.71706: done checking to see if all hosts have failed 46400 1727204565.71707: getting the remaining hosts for this loop 46400 1727204565.71708: done getting the remaining hosts for this loop 46400 1727204565.71712: getting the next task for host managed-node2 46400 1727204565.71719: done getting next task for host managed-node2 46400 1727204565.71723: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204565.71727: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204565.71751: getting variables 46400 1727204565.71753: in VariableManager get_vars() 46400 1727204565.71795: Calling all_inventory to load vars for managed-node2 46400 1727204565.71798: Calling groups_inventory to load vars for managed-node2 46400 1727204565.71800: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.71807: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.71808: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.71810: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.72597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.73957: done with get_vars() 46400 1727204565.73983: done getting variables 46400 1727204565.74042: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:45 -0400 (0:00:00.036) 0:00:56.025 ***** 46400 1727204565.74079: entering _queue_task() for managed-node2/fail 46400 1727204565.74389: worker is 1 (out of 1 available) 46400 1727204565.74403: exiting _queue_task() for managed-node2/fail 46400 1727204565.74416: done queuing things up, now waiting for results queue to drain 46400 1727204565.74418: waiting for pending results... 
46400 1727204565.74723: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204565.74889: in run() - task 0affcd87-79f5-1303-fda8-00000000127e 46400 1727204565.74908: variable 'ansible_search_path' from source: unknown 46400 1727204565.74916: variable 'ansible_search_path' from source: unknown 46400 1727204565.74958: calling self._execute() 46400 1727204565.75059: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.75076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.75091: variable 'omit' from source: magic vars 46400 1727204565.75432: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.75449: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204565.75582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204565.77262: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204565.77321: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204565.77348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204565.77378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204565.77401: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204565.77458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204565.77484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204565.77505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.77532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204565.77543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204565.77621: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.77630: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204565.77633: when evaluation is False, skipping this task 46400 1727204565.77636: _execute() done 46400 1727204565.77639: dumping result to json 46400 1727204565.77642: done dumping result, returning 46400 1727204565.77650: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-00000000127e] 46400 1727204565.77655: sending task result for task 
0affcd87-79f5-1303-fda8-00000000127e 46400 1727204565.77747: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127e 46400 1727204565.77750: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204565.77800: no more pending results, returning what we have 46400 1727204565.77804: results queue empty 46400 1727204565.77805: checking for any_errors_fatal 46400 1727204565.77813: done checking for any_errors_fatal 46400 1727204565.77814: checking for max_fail_percentage 46400 1727204565.77816: done checking for max_fail_percentage 46400 1727204565.77817: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.77817: done checking to see if all hosts have failed 46400 1727204565.77818: getting the remaining hosts for this loop 46400 1727204565.77820: done getting the remaining hosts for this loop 46400 1727204565.77824: getting the next task for host managed-node2 46400 1727204565.77833: done getting next task for host managed-node2 46400 1727204565.77837: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204565.77842: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204565.77953: getting variables 46400 1727204565.77998: in VariableManager get_vars() 46400 1727204565.78079: Calling all_inventory to load vars for managed-node2 46400 1727204565.78082: Calling groups_inventory to load vars for managed-node2 46400 1727204565.78085: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.78095: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.78098: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.78101: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.79699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204565.81645: done with get_vars() 46400 1727204565.81672: done getting variables 46400 1727204565.81715: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:45 -0400 (0:00:00.076) 0:00:56.101 ***** 46400 1727204565.81741: entering _queue_task() for managed-node2/dnf 46400 1727204565.81991: worker is 1 (out of 1 available) 46400 1727204565.82005: exiting _queue_task() for managed-node2/dnf 46400 1727204565.82018: done queuing things up, now waiting for results queue to drain 46400 1727204565.82020: waiting for pending results... 
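The "Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces" task (main.yml:36) resolves to the dnf action, and the trace that follows confirms two guards: `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` (True) and `__network_wireless_connections_defined or __network_team_connections_defined` (False, hence the skip). A plausible shape for the task; only the action type and the two when expressions come from the trace, the module arguments and check_mode flag are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed; the trace only confirms that the dnf action is used
        state: latest                   # assumed
      check_mode: true                  # assumed: the check should not change the host
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
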
46400 1727204565.82216: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204565.82307: in run() - task 0affcd87-79f5-1303-fda8-00000000127f 46400 1727204565.82317: variable 'ansible_search_path' from source: unknown 46400 1727204565.82321: variable 'ansible_search_path' from source: unknown 46400 1727204565.82352: calling self._execute() 46400 1727204565.82427: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204565.82433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204565.82441: variable 'omit' from source: magic vars 46400 1727204565.82723: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.82734: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204565.82882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204565.85615: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204565.85713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204565.85757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204565.85803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204565.85843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204565.85971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204565.86082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204565.86112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.86257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204565.86288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204565.86421: variable 'ansible_distribution' from source: facts 46400 1727204565.86431: variable 'ansible_distribution_major_version' from source: facts 46400 1727204565.86451: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204565.86601: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204565.86758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204565.86793: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204565.86831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.86880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204565.86898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204565.86951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204565.86984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204565.87013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.87071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204565.87089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204565.87141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204565.87173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204565.87201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.87254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204565.87277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204565.87464: variable 'network_connections' from source: include params 46400 1727204565.87485: variable 'interface' from source: play vars 46400 1727204565.87554: variable 'interface' from source: play vars 46400 1727204565.87642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204565.87843: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204565.87892: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204565.87932: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204565.87967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204565.88043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204565.88076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204565.88127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204565.88158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204565.88218: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204565.88459: variable 'network_connections' from source: include params 46400 1727204565.88473: variable 'interface' from source: play vars 46400 1727204565.88549: variable 'interface' from source: play vars 46400 1727204565.88589: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204565.88598: when evaluation is False, skipping this task 46400 1727204565.88605: _execute() done 46400 1727204565.88611: dumping result to json 46400 1727204565.88617: done dumping result, returning 46400 1727204565.88628: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000127f] 46400 1727204565.88639: sending task result for task 0affcd87-79f5-1303-fda8-00000000127f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204565.88832: no more pending results, returning what we have 46400 1727204565.88837: results queue empty 46400 1727204565.88840: checking for any_errors_fatal 46400 1727204565.88847: done checking for any_errors_fatal 46400 1727204565.88848: checking for max_fail_percentage 46400 1727204565.88850: done checking for max_fail_percentage 46400 1727204565.88851: checking to see if all hosts have failed and the running result is not ok 46400 1727204565.88852: done checking to see if all hosts have failed 46400 1727204565.88853: getting the remaining hosts for this loop 46400 1727204565.88855: done getting the remaining hosts for this loop 46400 1727204565.88862: getting the next task for host managed-node2 46400 1727204565.88876: done getting next task for host managed-node2 46400 1727204565.88881: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204565.88887: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204565.88915: getting variables 46400 1727204565.88917: in VariableManager get_vars() 46400 1727204565.88959: Calling all_inventory to load vars for managed-node2 46400 1727204565.88967: Calling groups_inventory to load vars for managed-node2 46400 1727204565.88970: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204565.88982: Calling all_plugins_play to load vars for managed-node2 46400 1727204565.88985: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204565.88988: Calling groups_plugins_play to load vars for managed-node2 46400 1727204565.90103: done sending task result for task 0affcd87-79f5-1303-fda8-00000000127f 46400 1727204565.90107: WORKER PROCESS EXITING 46400 1727204565.92940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.02189: done with get_vars() 46400 1727204566.02224: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204566.02319: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.206) 0:00:56.308 ***** 46400 1727204566.02345: entering _queue_task() for managed-node2/yum 46400 1727204566.02874: worker is 1 (out of 1 available) 46400 1727204566.02888: exiting _queue_task() for managed-node2/yum 46400 1727204566.02914: done queuing things up, now waiting for results queue to drain 46400 1727204566.02916: waiting for pending results... 
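The YUM variant of the same check (main.yml:48) is only meant for EL7 and older: the trace below shows its guard `ansible_distribution_major_version | int < 8` evaluating to False on this host, and also shows ansible.builtin.yum being redirected to ansible.builtin.dnf, since yum is only an alias on dnf-based systems. A sketch mirroring the DNF variant above; the module arguments are assumed, and the wireless/team guard is inferred from the task name rather than read from the trace:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # assumed
        state: latest                   # assumed
      check_mode: true                  # assumed
      when:
        - ansible_distribution_major_version | int < 8  # guard confirmed by the trace below
        - __network_wireless_connections_defined or __network_team_connections_defined  # inferred from the task name
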
46400 1727204566.03293: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204566.03655: in run() - task 0affcd87-79f5-1303-fda8-000000001280 46400 1727204566.03694: variable 'ansible_search_path' from source: unknown 46400 1727204566.03708: variable 'ansible_search_path' from source: unknown 46400 1727204566.03756: calling self._execute() 46400 1727204566.03966: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.03978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.03992: variable 'omit' from source: magic vars 46400 1727204566.04436: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.04455: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.04665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204566.07562: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204566.07671: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204566.07721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204566.07773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204566.07813: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204566.07911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.07944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.07979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.08036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.08058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.08184: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.08229: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204566.08237: when evaluation is False, skipping this task 46400 1727204566.08244: _execute() done 46400 1727204566.08251: dumping result to json 46400 1727204566.08258: done dumping result, returning 46400 1727204566.08273: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001280] 46400 
1727204566.08304: sending task result for task 0affcd87-79f5-1303-fda8-000000001280 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204566.08788: no more pending results, returning what we have 46400 1727204566.08792: results queue empty 46400 1727204566.08793: checking for any_errors_fatal 46400 1727204566.08803: done checking for any_errors_fatal 46400 1727204566.08804: checking for max_fail_percentage 46400 1727204566.08806: done checking for max_fail_percentage 46400 1727204566.08807: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.08808: done checking to see if all hosts have failed 46400 1727204566.08809: getting the remaining hosts for this loop 46400 1727204566.08811: done getting the remaining hosts for this loop 46400 1727204566.08815: getting the next task for host managed-node2 46400 1727204566.08825: done getting next task for host managed-node2 46400 1727204566.08830: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204566.08835: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204566.08866: getting variables 46400 1727204566.08868: in VariableManager get_vars() 46400 1727204566.08914: Calling all_inventory to load vars for managed-node2 46400 1727204566.08917: Calling groups_inventory to load vars for managed-node2 46400 1727204566.08919: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.08930: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.08933: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.08936: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.10333: done sending task result for task 0affcd87-79f5-1303-fda8-000000001280 46400 1727204566.10336: WORKER PROCESS EXITING 46400 1727204566.11792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.14002: done with get_vars() 46400 1727204566.14032: done getting variables 46400 1727204566.14104: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.117) 0:00:56.425 ***** 46400 1727204566.14142: entering _queue_task() for managed-node2/fail 46400 1727204566.15429: worker is 1 (out of 1 available) 46400 1727204566.15444: exiting _queue_task() for managed-node2/fail 46400 1727204566.15458: done queuing things up, now waiting for results queue to drain 46400 1727204566.15460: waiting for pending results... 
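The consent task (main.yml:60) is another fail action; the trace that follows shows it guarded by `__network_wireless_connections_defined or __network_team_connections_defined`, which is False here, so no consent is needed. A rough sketch only -- the message text is invented, and the actual role presumably also checks a user-facing consent variable that never gets evaluated in this run:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Applying wireless or team profiles requires restarting NetworkManager  # wording assumed
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        # an additional consent flag is likely checked here, but it is not visible in this trace
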
46400 1727204566.16406: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204566.16587: in run() - task 0affcd87-79f5-1303-fda8-000000001281 46400 1727204566.16590: variable 'ansible_search_path' from source: unknown 46400 1727204566.16594: variable 'ansible_search_path' from source: unknown 46400 1727204566.16640: calling self._execute() 46400 1727204566.16746: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.16757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.16760: variable 'omit' from source: magic vars 46400 1727204566.17173: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.17245: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.17406: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204566.17872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204566.24309: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204566.24913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204566.24969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204566.25013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204566.25051: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204566.25234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.25883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.26079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.26198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.26224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.26737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.26768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.26800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.26856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.26879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.27004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.27033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.27063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.27119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.27393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.29969: variable 'network_connections' from source: include params 46400 1727204566.30211: variable 'interface' from source: play vars 46400 1727204566.30890: variable 'interface' from source: play vars 46400 1727204566.31438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204566.31922: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204566.31967: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204566.32100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204566.32136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204566.32268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204566.32297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204566.32335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.32370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204566.32422: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204566.32688: variable 'network_connections' 
from source: include params 46400 1727204566.32698: variable 'interface' from source: play vars 46400 1727204566.32774: variable 'interface' from source: play vars 46400 1727204566.32802: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204566.32809: when evaluation is False, skipping this task 46400 1727204566.32814: _execute() done 46400 1727204566.32819: dumping result to json 46400 1727204566.32824: done dumping result, returning 46400 1727204566.32833: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001281] 46400 1727204566.32843: sending task result for task 0affcd87-79f5-1303-fda8-000000001281 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204566.33018: no more pending results, returning what we have 46400 1727204566.33024: results queue empty 46400 1727204566.33025: checking for any_errors_fatal 46400 1727204566.33030: done checking for any_errors_fatal 46400 1727204566.33031: checking for max_fail_percentage 46400 1727204566.33033: done checking for max_fail_percentage 46400 1727204566.33034: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.33035: done checking to see if all hosts have failed 46400 1727204566.33036: getting the remaining hosts for this loop 46400 1727204566.33038: done getting the remaining hosts for this loop 46400 1727204566.33042: getting the next task for host managed-node2 46400 1727204566.33052: done getting next task for host managed-node2 46400 1727204566.33056: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204566.33061: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204566.33087: getting variables 46400 1727204566.33089: in VariableManager get_vars() 46400 1727204566.33131: Calling all_inventory to load vars for managed-node2 46400 1727204566.33134: Calling groups_inventory to load vars for managed-node2 46400 1727204566.33136: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.33147: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.33149: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.33152: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.34212: done sending task result for task 0affcd87-79f5-1303-fda8-000000001281 46400 1727204566.34216: WORKER PROCESS EXITING 46400 1727204566.35215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.37609: done with get_vars() 46400 1727204566.37639: done getting variables 46400 1727204566.37823: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.237) 0:00:56.663 ***** 46400 1727204566.37861: entering _queue_task() for managed-node2/package 46400 1727204566.38282: worker is 1 (out of 1 available) 46400 1727204566.38296: exiting _queue_task() for managed-node2/package 46400 1727204566.38309: done queuing things up, now waiting for results queue to drain 46400 1727204566.38311: waiting for pending results... 
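The "Install packages" task (main.yml:73) uses the generic package action, and the trace below shows its guard `not network_packages is subset(ansible_facts.packages.keys())`: it only runs when something in the computed network_packages list is missing from the gathered package facts, which avoids an unnecessary package transaction on already-provisioned hosts. A minimal sketch; the state argument is assumed:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present  # assumed
      when:
        - not network_packages is subset(ansible_facts.packages.keys())  # guard confirmed by the trace below
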
46400 1727204566.38651: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204566.38823: in run() - task 0affcd87-79f5-1303-fda8-000000001282 46400 1727204566.38844: variable 'ansible_search_path' from source: unknown 46400 1727204566.38853: variable 'ansible_search_path' from source: unknown 46400 1727204566.38901: calling self._execute() 46400 1727204566.39007: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.39020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.39034: variable 'omit' from source: magic vars 46400 1727204566.39431: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.39447: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.39657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204566.39938: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204566.39996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204566.40034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204566.40116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204566.40239: variable 'network_packages' from source: role '' defaults 46400 1727204566.40353: variable '__network_provider_setup' from source: role '' defaults 46400 1727204566.40370: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204566.40443: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204566.40457: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204566.40528: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204566.40727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204566.44160: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204566.44233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204566.44283: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204566.44322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204566.44359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204566.44474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.44508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.44554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.44627: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.44646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.44702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.44730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.44757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.44809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.44827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.45073: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204566.45208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.45245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.45281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.45324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.45351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.45450: variable 'ansible_python' from source: facts 46400 1727204566.45477: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204566.45571: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204566.45658: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204566.45795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.45823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204566.45853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.45904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.45919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.45960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.46001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.46027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.46068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.46086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.46255: variable 'network_connections' from source: include params 46400 1727204566.46269: variable 'interface' from source: play vars 46400 1727204566.46384: variable 'interface' from source: play vars 46400 1727204566.46471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204566.46505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204566.46550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.46589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204566.46645: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204566.47001: variable 'network_connections' from source: include params 46400 1727204566.47010: variable 'interface' from source: play vars 46400 1727204566.47114: variable 'interface' from source: play vars 46400 1727204566.47151: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204566.47245: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204566.47708: variable 'network_connections' from source: include params 46400 
1727204566.47845: variable 'interface' from source: play vars 46400 1727204566.47913: variable 'interface' from source: play vars 46400 1727204566.48081: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204566.48270: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204566.48873: variable 'network_connections' from source: include params 46400 1727204566.48884: variable 'interface' from source: play vars 46400 1727204566.49023: variable 'interface' from source: play vars 46400 1727204566.49090: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204566.49160: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204566.49174: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204566.49236: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204566.49486: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204566.50012: variable 'network_connections' from source: include params 46400 1727204566.50027: variable 'interface' from source: play vars 46400 1727204566.50087: variable 'interface' from source: play vars 46400 1727204566.50099: variable 'ansible_distribution' from source: facts 46400 1727204566.50107: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.50117: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.50143: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204566.50305: variable 'ansible_distribution' from source: facts 46400 1727204566.50314: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.50323: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.50340: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204566.50511: variable 'ansible_distribution' from source: facts 46400 1727204566.50520: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.50528: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.50572: variable 'network_provider' from source: set_fact 46400 1727204566.50596: variable 'ansible_facts' from source: unknown 46400 1727204566.51456: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204566.51465: when evaluation is False, skipping this task 46400 1727204566.51472: _execute() done 46400 1727204566.51479: dumping result to json 46400 1727204566.51485: done dumping result, returning 46400 1727204566.51496: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000001282] 46400 1727204566.51506: sending task result for task 0affcd87-79f5-1303-fda8-000000001282 46400 1727204566.51642: done sending task result for task 0affcd87-79f5-1303-fda8-000000001282 46400 1727204566.51655: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204566.51708: no more pending results, returning what we have 46400 1727204566.51713: results queue empty 46400 1727204566.51714: checking for any_errors_fatal 46400 1727204566.51723: done checking for 
any_errors_fatal 46400 1727204566.51724: checking for max_fail_percentage 46400 1727204566.51726: done checking for max_fail_percentage 46400 1727204566.51727: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.51728: done checking to see if all hosts have failed 46400 1727204566.51729: getting the remaining hosts for this loop 46400 1727204566.51731: done getting the remaining hosts for this loop 46400 1727204566.51736: getting the next task for host managed-node2 46400 1727204566.51746: done getting next task for host managed-node2 46400 1727204566.51753: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204566.51758: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204566.51786: getting variables 46400 1727204566.51788: in VariableManager get_vars() 46400 1727204566.51837: Calling all_inventory to load vars for managed-node2 46400 1727204566.51840: Calling groups_inventory to load vars for managed-node2 46400 1727204566.51843: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.51855: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.51858: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.51861: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.53544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.55393: done with get_vars() 46400 1727204566.55424: done getting variables 46400 1727204566.55495: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.176) 0:00:56.839 ***** 46400 1727204566.55531: entering _queue_task() for managed-node2/package 46400 1727204566.55883: worker is 1 (out of 1 available) 46400 1727204566.55896: exiting _queue_task() for managed-node2/package 46400 1727204566.55908: done queuing things up, now waiting for results queue to drain 46400 1727204566.55909: waiting for pending results... 
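The next two tasks (main.yml:85 and main.yml:96) are both package installs gated on the declarative network_state variable; the trace shows each of them skipping because `network_state != {}` is False in this run. A sketch of the first one, with the package names taken from the task title and the state assumed:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present  # assumed
      when: network_state != {}  # guard confirmed by the trace below
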
46400 1727204566.56214: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204566.56383: in run() - task 0affcd87-79f5-1303-fda8-000000001283 46400 1727204566.56401: variable 'ansible_search_path' from source: unknown 46400 1727204566.56408: variable 'ansible_search_path' from source: unknown 46400 1727204566.56454: calling self._execute() 46400 1727204566.56557: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.56573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.56588: variable 'omit' from source: magic vars 46400 1727204566.56968: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.56989: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.57124: variable 'network_state' from source: role '' defaults 46400 1727204566.57140: Evaluated conditional (network_state != {}): False 46400 1727204566.57147: when evaluation is False, skipping this task 46400 1727204566.57154: _execute() done 46400 1727204566.57160: dumping result to json 46400 1727204566.57169: done dumping result, returning 46400 1727204566.57179: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001283] 46400 1727204566.57190: sending task result for task 0affcd87-79f5-1303-fda8-000000001283 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204566.57370: no more pending results, returning what we have 46400 1727204566.57375: results queue empty 46400 1727204566.57376: checking for any_errors_fatal 46400 1727204566.57386: done checking for any_errors_fatal 46400 1727204566.57387: checking for max_fail_percentage 46400 1727204566.57389: done checking for max_fail_percentage 46400 1727204566.57390: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.57391: done checking to see if all hosts have failed 46400 1727204566.57392: getting the remaining hosts for this loop 46400 1727204566.57393: done getting the remaining hosts for this loop 46400 1727204566.57398: getting the next task for host managed-node2 46400 1727204566.57408: done getting next task for host managed-node2 46400 1727204566.57413: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204566.57419: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204566.57448: getting variables 46400 1727204566.57450: in VariableManager get_vars() 46400 1727204566.57493: Calling all_inventory to load vars for managed-node2 46400 1727204566.57497: Calling groups_inventory to load vars for managed-node2 46400 1727204566.57499: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.57512: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.57515: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.57518: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.58792: done sending task result for task 0affcd87-79f5-1303-fda8-000000001283 46400 1727204566.58796: WORKER PROCESS EXITING 46400 1727204566.59444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.61127: done with get_vars() 46400 1727204566.61160: done getting variables 46400 1727204566.61222: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.057) 0:00:56.897 ***** 46400 1727204566.61265: entering _queue_task() for managed-node2/package 46400 1727204566.61607: worker is 1 (out of 1 available) 46400 1727204566.61620: exiting _queue_task() for managed-node2/package 46400 1727204566.61633: done queuing things up, now waiting for results queue to drain 46400 1727204566.61635: waiting for pending results... 
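Both package tasks in this block skip for the same reason: network_state is resolved from the role defaults ("from source: role '' defaults") and is an empty mapping, so the conditional network_state != {} evaluates to False and the false_condition is echoed back in each skip result. A sketch of the assumed default declaration (file name and placement are assumptions, not confirmed by this log):

  # defaults/main.yml (assumed location)
  network_state: {}
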
46400 1727204566.61942: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204566.62109: in run() - task 0affcd87-79f5-1303-fda8-000000001284 46400 1727204566.62131: variable 'ansible_search_path' from source: unknown 46400 1727204566.62139: variable 'ansible_search_path' from source: unknown 46400 1727204566.62179: calling self._execute() 46400 1727204566.62283: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.62299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.62312: variable 'omit' from source: magic vars 46400 1727204566.62706: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.62722: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.62856: variable 'network_state' from source: role '' defaults 46400 1727204566.62876: Evaluated conditional (network_state != {}): False 46400 1727204566.62887: when evaluation is False, skipping this task 46400 1727204566.62894: _execute() done 46400 1727204566.62901: dumping result to json 46400 1727204566.62907: done dumping result, returning 46400 1727204566.62917: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001284] 46400 1727204566.62927: sending task result for task 0affcd87-79f5-1303-fda8-000000001284 46400 1727204566.63051: done sending task result for task 0affcd87-79f5-1303-fda8-000000001284 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204566.63098: no more pending results, returning what we have 46400 1727204566.63102: results queue empty 46400 1727204566.63104: checking for any_errors_fatal 46400 1727204566.63110: done checking for any_errors_fatal 46400 1727204566.63111: checking for max_fail_percentage 46400 1727204566.63113: done checking for max_fail_percentage 46400 1727204566.63114: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.63115: done checking to see if all hosts have failed 46400 1727204566.63115: getting the remaining hosts for this loop 46400 1727204566.63117: done getting the remaining hosts for this loop 46400 1727204566.63121: getting the next task for host managed-node2 46400 1727204566.63132: done getting next task for host managed-node2 46400 1727204566.63137: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204566.63143: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204566.63172: getting variables 46400 1727204566.63174: in VariableManager get_vars() 46400 1727204566.63214: Calling all_inventory to load vars for managed-node2 46400 1727204566.63217: Calling groups_inventory to load vars for managed-node2 46400 1727204566.63219: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.63233: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.63236: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.63240: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.64203: WORKER PROCESS EXITING 46400 1727204566.64954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.66863: done with get_vars() 46400 1727204566.66888: done getting variables 46400 1727204566.66957: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.057) 0:00:56.954 ***** 46400 1727204566.66998: entering _queue_task() for managed-node2/service 46400 1727204566.67349: worker is 1 (out of 1 available) 46400 1727204566.67363: exiting _queue_task() for managed-node2/service 46400 1727204566.67382: done queuing things up, now waiting for results queue to drain 46400 1727204566.67384: waiting for pending results... 
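The task queued here (roles/network/tasks/main.yml:109) uses the 'service' action to restart NetworkManager when wireless or team connections are configured; the guard variables named in the evaluation that follows are __network_wireless_connections_defined and __network_team_connections_defined. A minimal sketch of such a guarded restart, with the service name and parameters assumed rather than copied from the role's source:

  - name: Restart NetworkManager due to wireless or team interfaces
    service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined
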
46400 1727204566.67704: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204566.67856: in run() - task 0affcd87-79f5-1303-fda8-000000001285 46400 1727204566.67879: variable 'ansible_search_path' from source: unknown 46400 1727204566.67887: variable 'ansible_search_path' from source: unknown 46400 1727204566.67926: calling self._execute() 46400 1727204566.68022: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.68032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.68051: variable 'omit' from source: magic vars 46400 1727204566.68420: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.68436: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.68571: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204566.68787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204566.71296: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204566.71384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204566.71438: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204566.71482: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204566.71556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204566.71643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.71678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.71708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.71758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.71779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.71829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.71869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.71899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204566.71942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.71969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.72013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.72040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.72077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.72119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.72135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.72326: variable 'network_connections' from source: include params 46400 1727204566.72342: variable 'interface' from source: play vars 46400 1727204566.72424: variable 'interface' from source: play vars 46400 1727204566.72508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204566.72684: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204566.72744: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204566.72779: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204566.72813: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204566.72866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204566.72897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204566.72935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.72968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204566.73019: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204566.73417: variable 'network_connections' from source: include params 46400 1727204566.73427: variable 'interface' 
from source: play vars 46400 1727204566.73491: variable 'interface' from source: play vars 46400 1727204566.73527: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204566.73535: when evaluation is False, skipping this task 46400 1727204566.73542: _execute() done 46400 1727204566.73548: dumping result to json 46400 1727204566.73556: done dumping result, returning 46400 1727204566.73571: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001285] 46400 1727204566.73581: sending task result for task 0affcd87-79f5-1303-fda8-000000001285 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204566.73737: no more pending results, returning what we have 46400 1727204566.73741: results queue empty 46400 1727204566.73742: checking for any_errors_fatal 46400 1727204566.73750: done checking for any_errors_fatal 46400 1727204566.73751: checking for max_fail_percentage 46400 1727204566.73753: done checking for max_fail_percentage 46400 1727204566.73754: checking to see if all hosts have failed and the running result is not ok 46400 1727204566.73755: done checking to see if all hosts have failed 46400 1727204566.73756: getting the remaining hosts for this loop 46400 1727204566.73757: done getting the remaining hosts for this loop 46400 1727204566.73761: getting the next task for host managed-node2 46400 1727204566.73774: done getting next task for host managed-node2 46400 1727204566.73779: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204566.73784: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204566.73811: getting variables 46400 1727204566.73813: in VariableManager get_vars() 46400 1727204566.73854: Calling all_inventory to load vars for managed-node2 46400 1727204566.73857: Calling groups_inventory to load vars for managed-node2 46400 1727204566.73859: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204566.73873: Calling all_plugins_play to load vars for managed-node2 46400 1727204566.73876: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204566.73879: Calling groups_plugins_play to load vars for managed-node2 46400 1727204566.74984: done sending task result for task 0affcd87-79f5-1303-fda8-000000001285 46400 1727204566.74998: WORKER PROCESS EXITING 46400 1727204566.75674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204566.78318: done with get_vars() 46400 1727204566.78354: done getting variables 46400 1727204566.78440: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:46 -0400 (0:00:00.114) 0:00:57.069 ***** 46400 1727204566.78478: entering _queue_task() for managed-node2/service 46400 1727204566.78855: worker is 1 (out of 1 available) 46400 1727204566.78873: exiting _queue_task() for managed-node2/service 46400 1727204566.78890: done queuing things up, now waiting for results queue to drain 46400 1727204566.78892: waiting for pending results... 
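The 'Enable and start NetworkManager' task (roles/network/tasks/main.yml:122) is the first task in this block whose conditional holds (network_provider == "nm" or network_state != {}), so the run below resolves network_service_name, opens the ssh connection, and executes the systemd module remotely; the module_args at the end of the result confirm name=NetworkManager, state=started, enabled=true. A sketch of the assumed task shape, not the role's literal source:

  - name: Enable and start NetworkManager
    service:
      name: "{{ network_service_name }}"
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}
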
46400 1727204566.79203: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204566.79361: in run() - task 0affcd87-79f5-1303-fda8-000000001286 46400 1727204566.79385: variable 'ansible_search_path' from source: unknown 46400 1727204566.79398: variable 'ansible_search_path' from source: unknown 46400 1727204566.79441: calling self._execute() 46400 1727204566.79546: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.79566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.79581: variable 'omit' from source: magic vars 46400 1727204566.79967: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.79985: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204566.80103: variable 'network_provider' from source: set_fact 46400 1727204566.80107: variable 'network_state' from source: role '' defaults 46400 1727204566.80116: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204566.80122: variable 'omit' from source: magic vars 46400 1727204566.80168: variable 'omit' from source: magic vars 46400 1727204566.80188: variable 'network_service_name' from source: role '' defaults 46400 1727204566.80240: variable 'network_service_name' from source: role '' defaults 46400 1727204566.80317: variable '__network_provider_setup' from source: role '' defaults 46400 1727204566.80322: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204566.80371: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204566.80377: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204566.80422: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204566.80658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204566.83283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204566.83342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204566.83373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204566.83399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204566.83423: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204566.83487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.83507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.83530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.83557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204566.83572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.83604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.83622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.83643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.83673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.83683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.83839: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204566.83924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.83941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.83961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.83992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.84003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.84072: variable 'ansible_python' from source: facts 46400 1727204566.84086: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204566.84144: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204566.84205: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204566.84292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.84310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.84326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.84351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.84361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.84400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204566.84420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204566.84437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.84462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204566.84476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204566.85374: variable 'network_connections' from source: include params 46400 1727204566.85377: variable 'interface' from source: play vars 46400 1727204566.85380: variable 'interface' from source: play vars 46400 1727204566.85383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204566.85858: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204566.85920: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204566.85959: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204566.86005: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204566.86073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204566.86104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204566.86136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204566.86171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204566.86257: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204566.86572: variable 'network_connections' from source: include params 46400 1727204566.86577: variable 'interface' from source: play vars 46400 1727204566.86652: variable 'interface' from source: play vars 46400 1727204566.86690: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204566.86770: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204566.87062: variable 'network_connections' from source: include params 46400 1727204566.87071: variable 'interface' from source: play vars 46400 1727204566.87134: variable 'interface' from source: play vars 46400 1727204566.87155: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204566.87233: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204566.87495: variable 'network_connections' from source: include params 46400 1727204566.87498: variable 'interface' from source: play vars 46400 1727204566.87561: variable 'interface' from source: play vars 46400 1727204566.87612: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204566.87673: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204566.87676: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204566.87731: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204566.87924: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204566.89282: variable 'network_connections' from source: include params 46400 1727204566.89286: variable 'interface' from source: play vars 46400 1727204566.89348: variable 'interface' from source: play vars 46400 1727204566.89355: variable 'ansible_distribution' from source: facts 46400 1727204566.89358: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.89370: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.89386: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204566.89548: variable 'ansible_distribution' from source: facts 46400 1727204566.89551: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.89556: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.89574: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204566.89733: variable 'ansible_distribution' from source: facts 46400 1727204566.89736: variable '__network_rh_distros' from source: role '' defaults 46400 1727204566.89747: variable 'ansible_distribution_major_version' from source: facts 46400 1727204566.89830: variable 'network_provider' from source: set_fact 46400 1727204566.89876: variable 'omit' from source: magic vars 46400 1727204566.89913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204566.89940: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204566.89961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204566.89981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204566.90007: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204566.90024: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204566.90027: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.90029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.90122: Set connection var ansible_shell_type to sh 46400 1727204566.90132: Set connection var ansible_shell_executable to /bin/sh 46400 1727204566.90137: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204566.90142: Set connection var ansible_connection to ssh 46400 1727204566.90148: Set connection var ansible_pipelining to False 46400 1727204566.90153: Set connection var ansible_timeout to 10 46400 1727204566.90187: variable 'ansible_shell_executable' from source: unknown 46400 1727204566.90190: variable 'ansible_connection' from source: unknown 46400 1727204566.90194: variable 'ansible_module_compression' from source: unknown 46400 1727204566.90196: variable 'ansible_shell_type' from source: unknown 46400 1727204566.90198: variable 'ansible_shell_executable' from source: unknown 46400 1727204566.90201: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204566.90204: variable 'ansible_pipelining' from source: unknown 46400 1727204566.90207: variable 'ansible_timeout' from source: unknown 46400 1727204566.90209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204566.90314: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204566.90324: variable 'omit' from source: magic vars 46400 1727204566.90334: starting attempt loop 46400 1727204566.90336: running the handler 46400 1727204566.90410: variable 'ansible_facts' from source: unknown 46400 1727204566.91212: _low_level_execute_command(): starting 46400 1727204566.91219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204566.92106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204566.92110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.92113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204566.92115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.92118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.92120: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204566.92123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.92125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204566.92127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204566.92129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204566.92131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.92133: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204566.92412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.92415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.92417: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204566.92420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.92422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204566.92424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204566.92425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204566.92427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204566.93997: stdout chunk (state=3): >>>/root <<< 46400 1727204566.94166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204566.94174: stdout chunk (state=3): >>><<< 46400 1727204566.94184: stderr chunk (state=3): >>><<< 46400 1727204566.94204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204566.94218: _low_level_execute_command(): starting 46400 1727204566.94224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021 `" && echo ansible-tmp-1727204566.9420552-50120-43625976227021="` echo /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021 `" ) && sleep 0' 46400 1727204566.94951: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204566.94964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.94976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204566.94986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.95020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.95027: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204566.95037: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.95051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204566.95058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204566.95066: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204566.95084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.95090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204566.95101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.95109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.95115: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204566.95125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.95200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204566.95208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204566.95211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204566.95285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204566.97140: stdout chunk (state=3): >>>ansible-tmp-1727204566.9420552-50120-43625976227021=/root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021 <<< 46400 1727204566.97350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204566.97354: stdout chunk (state=3): >>><<< 46400 1727204566.97357: stderr chunk (state=3): >>><<< 46400 1727204566.97578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204566.9420552-50120-43625976227021=/root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204566.97587: variable 'ansible_module_compression' from source: unknown 46400 1727204566.97590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204566.97592: variable 'ansible_facts' from source: unknown 46400 1727204566.97801: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/AnsiballZ_systemd.py 46400 1727204566.97977: Sending initial data 46400 1727204566.97981: Sent initial data (155 bytes) 46400 1727204566.99070: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204566.99088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.99114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204566.99135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.99181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.99195: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204566.99215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.99241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204566.99254: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204566.99271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204566.99286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204566.99300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204566.99319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204566.99342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204566.99355: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204566.99377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204566.99471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204566.99495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204566.99513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204566.99591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.01336: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204567.01420: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204567.01425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp70yfo814 /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/AnsiballZ_systemd.py <<< 46400 1727204567.01622: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 
1727204567.03552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.03671: stderr chunk (state=3): >>><<< 46400 1727204567.03675: stdout chunk (state=3): >>><<< 46400 1727204567.03678: done transferring module to remote 46400 1727204567.03687: _low_level_execute_command(): starting 46400 1727204567.03693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/ /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/AnsiballZ_systemd.py && sleep 0' 46400 1727204567.04158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.04166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.04199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204567.04203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.04206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204567.04208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.04255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.04262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.04311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.06006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.06071: stderr chunk (state=3): >>><<< 46400 1727204567.06075: stdout chunk (state=3): >>><<< 46400 1727204567.06088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 46400 1727204567.06091: _low_level_execute_command(): starting 46400 1727204567.06097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/AnsiballZ_systemd.py && sleep 0' 46400 1727204567.06546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.06549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.06601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.06605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.06608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204567.06617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.06669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.06682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.06687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.06731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.31656: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204567.31707: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "7020544", "MemoryAvailable": "infinity", "CPUUsageNSec": "2125784000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", 
"RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", 
"invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204567.33398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204567.33402: stdout chunk (state=3): >>><<< 46400 1727204567.33404: stderr chunk (state=3): >>><<< 46400 1727204567.33591: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "7020544", "MemoryAvailable": "infinity", "CPUUsageNSec": "2125784000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", 
"MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service 
multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204567.33681: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204567.33685: _low_level_execute_command(): starting 46400 1727204567.33698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204566.9420552-50120-43625976227021/ > /dev/null 2>&1 && sleep 0' 46400 1727204567.36074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204567.36184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.36193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.36208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.36253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.36262: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204567.36272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.36287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204567.36294: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204567.36301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204567.36337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.36346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.36357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.36366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.36375: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204567.36384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.36577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.36597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.36610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.36688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.38595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.38600: stdout chunk (state=3): >>><<< 46400 1727204567.38606: stderr chunk (state=3): >>><<< 46400 1727204567.38630: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204567.38635: handler run complete 46400 1727204567.38700: attempt loop complete, returning result 46400 1727204567.38703: _execute() done 46400 1727204567.38706: dumping result to json 46400 1727204567.38725: done dumping result, returning 46400 1727204567.38735: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000001286] 46400 1727204567.38740: sending task result for task 0affcd87-79f5-1303-fda8-000000001286 46400 1727204567.39047: done sending task result for task 0affcd87-79f5-1303-fda8-000000001286 46400 1727204567.39051: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204567.39118: no more pending results, returning what we have 46400 1727204567.39122: results queue empty 46400 1727204567.39123: checking for any_errors_fatal 46400 1727204567.39130: done checking for any_errors_fatal 46400 1727204567.39131: checking for max_fail_percentage 46400 1727204567.39133: done checking for max_fail_percentage 46400 1727204567.39134: checking to see if all hosts have failed and the running result is not ok 46400 1727204567.39135: done checking to see if all hosts have failed 46400 1727204567.39136: getting the remaining hosts for this loop 46400 1727204567.39137: done getting the remaining hosts for this loop 46400 1727204567.39142: getting the next task for host managed-node2 46400 1727204567.39151: done getting next task for host managed-node2 46400 1727204567.39156: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204567.39161: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204567.39188: getting variables 46400 1727204567.39191: in VariableManager get_vars() 46400 1727204567.39229: Calling all_inventory to load vars for managed-node2 46400 1727204567.39232: Calling groups_inventory to load vars for managed-node2 46400 1727204567.39235: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204567.39247: Calling all_plugins_play to load vars for managed-node2 46400 1727204567.39250: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204567.39252: Calling groups_plugins_play to load vars for managed-node2 46400 1727204567.42450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204567.44392: done with get_vars() 46400 1727204567.44428: done getting variables 46400 1727204567.44498: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:47 -0400 (0:00:00.660) 0:00:57.729 ***** 46400 1727204567.44536: entering _queue_task() for managed-node2/service 46400 1727204567.45129: worker is 1 (out of 1 available) 46400 1727204567.45142: exiting _queue_task() for managed-node2/service 46400 1727204567.45164: done queuing things up, now waiting for results queue to drain 46400 1727204567.45167: waiting for pending results... 
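The NetworkManager task just reported ok, but its result body was replaced with the fixed "censored" message because the play sets no_log: true; only the changed flag survives in the printed output. A minimal sketch of that observable effect follows (a hypothetical helper written for illustration, not Ansible's own callback code):

    def display_result(result: dict, no_log: bool) -> dict:
        # Illustration only: when no_log is set, everything except `changed`
        # is replaced by the fixed "censored" message seen above.
        if not no_log:
            return result
        return {
            "censored": "the output has been hidden due to the fact that "
                        "'no_log: true' was specified for this result",
            "changed": result.get("changed", False),
        }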
46400 1727204567.45475: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204567.45740: in run() - task 0affcd87-79f5-1303-fda8-000000001287 46400 1727204567.45752: variable 'ansible_search_path' from source: unknown 46400 1727204567.45756: variable 'ansible_search_path' from source: unknown 46400 1727204567.45795: calling self._execute() 46400 1727204567.46010: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.46016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.46026: variable 'omit' from source: magic vars 46400 1727204567.46734: variable 'ansible_distribution_major_version' from source: facts 46400 1727204567.46745: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204567.46871: variable 'network_provider' from source: set_fact 46400 1727204567.46875: Evaluated conditional (network_provider == "nm"): True 46400 1727204567.46974: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204567.47180: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204567.47416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204567.50394: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204567.50471: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204567.50510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204567.50553: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204567.50582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204567.50768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204567.50795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204567.50819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204567.50869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204567.50887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204567.50932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204567.50967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204567.50992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204567.51031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204567.51044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204567.51203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204567.51254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204567.52651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204567.52674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204567.52694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204567.52861: variable 'network_connections' from source: include params 46400 1727204567.52880: variable 'interface' from source: play vars 46400 1727204567.53102: variable 'interface' from source: play vars 46400 1727204567.53283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204567.53411: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204567.53450: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204567.53488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204567.53557: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204567.53689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204567.53711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204567.53855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204567.53886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204567.53934: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204567.54606: variable 'network_connections' from source: include params 46400 1727204567.54612: variable 'interface' from source: play vars 46400 1727204567.54684: variable 'interface' from source: play vars 46400 1727204567.54791: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204567.54795: when evaluation is False, skipping this task 46400 1727204567.54797: _execute() done 46400 1727204567.54800: dumping result to json 46400 1727204567.54802: done dumping result, returning 46400 1727204567.54811: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000001287] 46400 1727204567.54943: sending task result for task 0affcd87-79f5-1303-fda8-000000001287 46400 1727204567.55045: done sending task result for task 0affcd87-79f5-1303-fda8-000000001287 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204567.55099: no more pending results, returning what we have 46400 1727204567.55104: results queue empty 46400 1727204567.55105: checking for any_errors_fatal 46400 1727204567.55145: done checking for any_errors_fatal 46400 1727204567.55147: checking for max_fail_percentage 46400 1727204567.55149: done checking for max_fail_percentage 46400 1727204567.55150: checking to see if all hosts have failed and the running result is not ok 46400 1727204567.55151: done checking to see if all hosts have failed 46400 1727204567.55152: getting the remaining hosts for this loop 46400 1727204567.55154: done getting the remaining hosts for this loop 46400 1727204567.55158: getting the next task for host managed-node2 46400 1727204567.55188: done getting next task for host managed-node2 46400 1727204567.55193: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204567.55199: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204567.55227: getting variables 46400 1727204567.55229: in VariableManager get_vars() 46400 1727204567.55280: Calling all_inventory to load vars for managed-node2 46400 1727204567.55287: Calling groups_inventory to load vars for managed-node2 46400 1727204567.55291: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204567.55303: Calling all_plugins_play to load vars for managed-node2 46400 1727204567.55306: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204567.55310: Calling groups_plugins_play to load vars for managed-node2 46400 1727204567.55934: WORKER PROCESS EXITING 46400 1727204567.57204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204567.59875: done with get_vars() 46400 1727204567.59901: done getting variables 46400 1727204567.59956: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:47 -0400 (0:00:00.154) 0:00:57.884 ***** 46400 1727204567.59987: entering _queue_task() for managed-node2/service 46400 1727204567.60232: worker is 1 (out of 1 available) 46400 1727204567.60247: exiting _queue_task() for managed-node2/service 46400 1727204567.60263: done queuing things up, now waiting for results queue to drain 46400 1727204567.60266: waiting for pending results... 
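The wpa_supplicant task above is gated on the role default __network_wpa_supplicant_required, which the log shows being derived from __network_ieee802_1x_connections_defined and __network_wireless_connections_defined over network_connections. The sketch below is an assumption about the role's intent rather than its exact Jinja2 expression, with illustrative key names:

    def wpa_supplicant_required(network_connections: list) -> bool:
        # Hypothetical restatement of the role default evaluated above:
        # wpa_supplicant is only needed when a profile uses 802.1X
        # authentication or is a wireless connection.
        for conn in network_connections:
            if conn.get("ieee802_1x") or conn.get("type") == "wireless":
                return True
        return False

    # For the single test interface profile in this run this comes out False,
    # which is why "Enable and start wpa_supplicant" was skipped.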
46400 1727204567.60455: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204567.60565: in run() - task 0affcd87-79f5-1303-fda8-000000001288 46400 1727204567.60574: variable 'ansible_search_path' from source: unknown 46400 1727204567.60579: variable 'ansible_search_path' from source: unknown 46400 1727204567.60609: calling self._execute() 46400 1727204567.60687: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.60691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.60700: variable 'omit' from source: magic vars 46400 1727204567.60988: variable 'ansible_distribution_major_version' from source: facts 46400 1727204567.60997: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204567.61080: variable 'network_provider' from source: set_fact 46400 1727204567.61086: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204567.61089: when evaluation is False, skipping this task 46400 1727204567.61092: _execute() done 46400 1727204567.61094: dumping result to json 46400 1727204567.61097: done dumping result, returning 46400 1727204567.61103: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000001288] 46400 1727204567.61109: sending task result for task 0affcd87-79f5-1303-fda8-000000001288 46400 1727204567.61209: done sending task result for task 0affcd87-79f5-1303-fda8-000000001288 46400 1727204567.61212: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204567.61252: no more pending results, returning what we have 46400 1727204567.61257: results queue empty 46400 1727204567.61258: checking for any_errors_fatal 46400 1727204567.61272: done checking for any_errors_fatal 46400 1727204567.61273: checking for max_fail_percentage 46400 1727204567.61280: done checking for max_fail_percentage 46400 1727204567.61281: checking to see if all hosts have failed and the running result is not ok 46400 1727204567.61282: done checking to see if all hosts have failed 46400 1727204567.61283: getting the remaining hosts for this loop 46400 1727204567.61284: done getting the remaining hosts for this loop 46400 1727204567.61289: getting the next task for host managed-node2 46400 1727204567.61298: done getting next task for host managed-node2 46400 1727204567.61302: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204567.61308: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204567.61331: getting variables 46400 1727204567.61333: in VariableManager get_vars() 46400 1727204567.61372: Calling all_inventory to load vars for managed-node2 46400 1727204567.61375: Calling groups_inventory to load vars for managed-node2 46400 1727204567.61377: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204567.61393: Calling all_plugins_play to load vars for managed-node2 46400 1727204567.61396: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204567.61399: Calling groups_plugins_play to load vars for managed-node2 46400 1727204567.63308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204567.66575: done with get_vars() 46400 1727204567.66611: done getting variables 46400 1727204567.66676: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:47 -0400 (0:00:00.067) 0:00:57.951 ***** 46400 1727204567.66705: entering _queue_task() for managed-node2/copy 46400 1727204567.66967: worker is 1 (out of 1 available) 46400 1727204567.66980: exiting _queue_task() for managed-node2/copy 46400 1727204567.66993: done queuing things up, now waiting for results queue to drain 46400 1727204567.66995: waiting for pending results... 
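Taken together, the three service tasks traced so far (Enable and start NetworkManager, Enable and start wpa_supplicant, Enable network service) decide which system services to manage based on network_provider and whether wpa_supplicant is required. A hedged summary of the behaviour visible in this run, not the role's code, with the initscripts-branch service name assumed:

    def services_to_manage(network_provider: str, wpa_required: bool) -> list:
        # Rough restatement of the three service tasks above.
        services = []
        if network_provider == "nm":
            services.append("NetworkManager.service")
            if wpa_required:
                services.append("wpa_supplicant.service")
        elif network_provider == "initscripts":
            services.append("network.service")  # assumed legacy service name
        return services

    # In this run network_provider == "nm" and wpa_supplicant is not required,
    # so only NetworkManager.service is managed; the other two tasks skip.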
46400 1727204567.67203: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204567.67323: in run() - task 0affcd87-79f5-1303-fda8-000000001289 46400 1727204567.67332: variable 'ansible_search_path' from source: unknown 46400 1727204567.67336: variable 'ansible_search_path' from source: unknown 46400 1727204567.67370: calling self._execute() 46400 1727204567.67443: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.67449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.67456: variable 'omit' from source: magic vars 46400 1727204567.67740: variable 'ansible_distribution_major_version' from source: facts 46400 1727204567.67751: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204567.67836: variable 'network_provider' from source: set_fact 46400 1727204567.67840: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204567.67843: when evaluation is False, skipping this task 46400 1727204567.67846: _execute() done 46400 1727204567.67849: dumping result to json 46400 1727204567.67851: done dumping result, returning 46400 1727204567.67860: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000001289] 46400 1727204567.67870: sending task result for task 0affcd87-79f5-1303-fda8-000000001289 46400 1727204567.67965: done sending task result for task 0affcd87-79f5-1303-fda8-000000001289 46400 1727204567.67968: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204567.68018: no more pending results, returning what we have 46400 1727204567.68022: results queue empty 46400 1727204567.68023: checking for any_errors_fatal 46400 1727204567.68031: done checking for any_errors_fatal 46400 1727204567.68031: checking for max_fail_percentage 46400 1727204567.68033: done checking for max_fail_percentage 46400 1727204567.68034: checking to see if all hosts have failed and the running result is not ok 46400 1727204567.68035: done checking to see if all hosts have failed 46400 1727204567.68036: getting the remaining hosts for this loop 46400 1727204567.68037: done getting the remaining hosts for this loop 46400 1727204567.68042: getting the next task for host managed-node2 46400 1727204567.68053: done getting next task for host managed-node2 46400 1727204567.68056: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204567.68062: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204567.68275: getting variables 46400 1727204567.68278: in VariableManager get_vars() 46400 1727204567.68322: Calling all_inventory to load vars for managed-node2 46400 1727204567.68329: Calling groups_inventory to load vars for managed-node2 46400 1727204567.68332: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204567.68345: Calling all_plugins_play to load vars for managed-node2 46400 1727204567.68348: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204567.68351: Calling groups_plugins_play to load vars for managed-node2 46400 1727204567.69929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204567.71833: done with get_vars() 46400 1727204567.71852: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:47 -0400 (0:00:00.052) 0:00:58.003 ***** 46400 1727204567.71929: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204567.72266: worker is 1 (out of 1 available) 46400 1727204567.72284: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204567.72298: done queuing things up, now waiting for results queue to drain 46400 1727204567.72304: waiting for pending results... 
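Each of the skipped tasks in this stretch follows the same pattern: the when conditions are rendered in order and the first one that comes back False is echoed as false_condition together with a fixed skip reason, exactly as printed for the initscripts-file task above. A small sketch of that observable behaviour (the render callable stands in for Jinja2 evaluation and is an assumption, not Ansible internals):

    def evaluate_when(conditions, render):
        # Render conditions in order; the first False one becomes the
        # reported false_condition in a skip result.
        for cond in conditions:
            if not render(cond):
                return {
                    "changed": False,
                    "false_condition": cond,
                    "skip_reason": "Conditional result was False",
                }
        return None  # nothing false: the task actually runs

    # For the task above the chain was
    #   ["ansible_distribution_major_version != '6'",
    #    'network_provider == "initscripts"']
    # and the second condition rendered False.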
46400 1727204567.72565: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204567.72774: in run() - task 0affcd87-79f5-1303-fda8-00000000128a 46400 1727204567.72779: variable 'ansible_search_path' from source: unknown 46400 1727204567.72782: variable 'ansible_search_path' from source: unknown 46400 1727204567.72786: calling self._execute() 46400 1727204567.73056: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.73060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.73065: variable 'omit' from source: magic vars 46400 1727204567.73221: variable 'ansible_distribution_major_version' from source: facts 46400 1727204567.73232: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204567.73239: variable 'omit' from source: magic vars 46400 1727204567.73306: variable 'omit' from source: magic vars 46400 1727204567.73463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204567.76844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204567.76902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204567.76933: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204567.76959: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204567.76985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204567.77053: variable 'network_provider' from source: set_fact 46400 1727204567.77160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204567.77184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204567.77203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204567.77231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204567.77243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204567.77306: variable 'omit' from source: magic vars 46400 1727204567.77386: variable 'omit' from source: magic vars 46400 1727204567.77480: variable 'network_connections' from source: include params 46400 1727204567.77489: variable 'interface' from source: play vars 46400 1727204567.77534: variable 'interface' from source: play vars 46400 1727204567.77642: variable 'omit' from source: magic vars 46400 1727204567.77649: variable '__lsr_ansible_managed' from source: task vars 46400 1727204567.77697: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204567.77841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204567.78176: Loaded config def from plugin (lookup/template) 46400 1727204567.78293: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204567.78296: File lookup term: get_ansible_managed.j2 46400 1727204567.78299: variable 'ansible_search_path' from source: unknown 46400 1727204567.78302: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204567.78407: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204567.78413: variable 'ansible_search_path' from source: unknown 46400 1727204567.83240: variable 'ansible_managed' from source: unknown 46400 1727204567.83385: variable 'omit' from source: magic vars 46400 1727204567.83417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204567.83438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204567.83454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204567.83473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204567.83498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204567.83516: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204567.83519: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.83522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.83590: Set connection var ansible_shell_type to sh 46400 1727204567.83598: Set connection var ansible_shell_executable to /bin/sh 46400 1727204567.83603: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204567.83608: Set connection var ansible_connection to ssh 46400 1727204567.83615: Set connection var ansible_pipelining to False 46400 1727204567.83620: Set connection var ansible_timeout to 10 46400 1727204567.83640: variable 'ansible_shell_executable' from source: unknown 46400 1727204567.83642: variable 'ansible_connection' from source: unknown 46400 1727204567.83645: variable 'ansible_module_compression' 
from source: unknown 46400 1727204567.83647: variable 'ansible_shell_type' from source: unknown 46400 1727204567.83650: variable 'ansible_shell_executable' from source: unknown 46400 1727204567.83653: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204567.83655: variable 'ansible_pipelining' from source: unknown 46400 1727204567.83657: variable 'ansible_timeout' from source: unknown 46400 1727204567.83662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204567.83761: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204567.83775: variable 'omit' from source: magic vars 46400 1727204567.83780: starting attempt loop 46400 1727204567.83784: running the handler 46400 1727204567.83799: _low_level_execute_command(): starting 46400 1727204567.83805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204567.84328: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.84342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.84381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204567.84428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.84442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.84514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.84517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.84519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.84580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.86269: stdout chunk (state=3): >>>/root <<< 46400 1727204567.86374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.86467: stderr chunk (state=3): >>><<< 46400 1727204567.86478: stdout chunk (state=3): >>><<< 46400 1727204567.86503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204567.86513: _low_level_execute_command(): starting 46400 1727204567.86520: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828 `" && echo ansible-tmp-1727204567.8650284-50240-181567285214828="` echo /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828 `" ) && sleep 0' 46400 1727204567.87228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.87233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.87265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.87303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.87311: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204567.87331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.87342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.87365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.87370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.87378: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204567.87388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.87478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.87510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.87514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.87588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.89597: stdout chunk (state=3): >>>ansible-tmp-1727204567.8650284-50240-181567285214828=/root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828 <<< 46400 1727204567.89709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.89851: stderr chunk (state=3): >>><<< 46400 1727204567.89882: stdout chunk (state=3): >>><<< 46400 1727204567.89976: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204567.8650284-50240-181567285214828=/root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204567.90023: variable 'ansible_module_compression' from source: unknown 46400 1727204567.90045: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204567.90075: variable 'ansible_facts' from source: unknown 46400 1727204567.90193: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/AnsiballZ_network_connections.py 46400 1727204567.90394: Sending initial data 46400 1727204567.90398: Sent initial data (168 bytes) 46400 1727204567.91333: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204567.91339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.91349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.91360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.91480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.91494: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204567.91498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.91504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204567.91540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204567.91594: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204567.91597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.91599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.91601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.91604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.91606: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204567.91675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.91709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.91738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.91743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.91798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.93699: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204567.93744: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204567.93826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp8v3489fa /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/AnsiballZ_network_connections.py <<< 46400 1727204567.93831: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204567.95723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.95931: stderr chunk (state=3): >>><<< 46400 1727204567.95935: stdout chunk (state=3): >>><<< 46400 1727204567.95937: done transferring module to remote 46400 1727204567.95939: _low_level_execute_command(): starting 46400 1727204567.95941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/ /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/AnsiballZ_network_connections.py && sleep 0' 46400 1727204567.96575: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204567.96602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.96616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.96633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.96680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.96702: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204567.96723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.96740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204567.96772: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204567.96792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204567.96805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204567.96818: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204567.96835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204567.96850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204567.96868: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204567.96883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204567.96956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204567.96986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204567.97006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204567.97078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204567.98971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204567.98999: stderr chunk (state=3): >>><<< 46400 1727204567.99003: stdout chunk (state=3): >>><<< 46400 1727204567.99101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204567.99105: _low_level_execute_command(): starting 46400 1727204567.99108: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/AnsiballZ_network_connections.py && sleep 0' 46400 1727204568.00703: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204568.00718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.00733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.00755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.00801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204568.00812: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204568.00825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.00843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204568.00855: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204568.00867: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204568.00879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.00892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.00906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.00916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204568.00926: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204568.00938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.01016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204568.01039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204568.01057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.01138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.23051: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204568.24440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204568.24501: stderr chunk (state=3): >>><<< 46400 1727204568.24505: stdout chunk (state=3): >>><<< 46400 1727204568.24520: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
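The module result above shows what the role passed to fedora.linux_system_roles.network_connections: provider "nm" and a single connection "statebr" requested in state "up", which NetworkManager reports as already active (changed: false). As a minimal sketch only, and assuming the role's standard network_connections input variable, a playbook that would drive an equivalent invocation could look like:

- hosts: managed-node2
  vars:
    network_connections:
      - name: statebr
        state: up
  roles:
    - fedora.linux_system_roles.network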
46400 1727204568.24549: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204568.24561: _low_level_execute_command(): starting 46400 1727204568.24569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204567.8650284-50240-181567285214828/ > /dev/null 2>&1 && sleep 0' 46400 1727204568.25037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.25041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.25080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204568.25084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204568.25093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204568.25100: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.25117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204568.25120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.25172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204568.25184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.25240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.27061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.27119: stderr chunk (state=3): >>><<< 46400 1727204568.27123: stdout chunk (state=3): >>><<< 46400 1727204568.27138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204568.27153: handler run complete 46400 1727204568.27176: attempt loop complete, returning result 46400 1727204568.27179: _execute() done 46400 1727204568.27182: dumping result to json 46400 1727204568.27186: done dumping result, returning 46400 1727204568.27195: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-00000000128a] 46400 1727204568.27199: sending task result for task 0affcd87-79f5-1303-fda8-00000000128a 46400 1727204568.27308: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128a 46400 1727204568.27310: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active 46400 1727204568.27406: no more pending results, returning what we have 46400 1727204568.27410: results queue empty 46400 1727204568.27411: checking for any_errors_fatal 46400 1727204568.27419: done checking for any_errors_fatal 46400 1727204568.27420: checking for max_fail_percentage 46400 1727204568.27421: done checking for max_fail_percentage 46400 1727204568.27422: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.27423: done checking to see if all hosts have failed 46400 1727204568.27424: getting the remaining hosts for this loop 46400 1727204568.27425: done getting the remaining hosts for this loop 46400 1727204568.27429: getting the next task for host managed-node2 46400 1727204568.27436: done getting next task for host managed-node2 46400 1727204568.27441: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204568.27446: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204568.27459: getting variables 46400 1727204568.27463: in VariableManager get_vars() 46400 1727204568.27503: Calling all_inventory to load vars for managed-node2 46400 1727204568.27505: Calling groups_inventory to load vars for managed-node2 46400 1727204568.27508: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.27517: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.27520: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.27522: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.28387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.29334: done with get_vars() 46400 1727204568.29353: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.574) 0:00:58.578 ***** 46400 1727204568.29423: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204568.29675: worker is 1 (out of 1 available) 46400 1727204568.29690: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204568.29702: done queuing things up, now waiting for results queue to drain 46400 1727204568.29704: waiting for pending results... 
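The "Configure networking state" task queued above is guarded on the role's network_state variable; as the evaluation just below shows, network_state comes from the role defaults as an empty dict, so the conditional network_state != {} is False and the task is skipped. A hypothetical standalone task illustrating the same guard pattern (not the role's own tasks/main.yml:171):

- name: Run only when a declarative network_state is supplied (illustrative)
  ansible.builtin.debug:
    msg: network_state is non-empty, so the state-based provider would run
  when: network_state != {}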
46400 1727204568.29901: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204568.29997: in run() - task 0affcd87-79f5-1303-fda8-00000000128b 46400 1727204568.30008: variable 'ansible_search_path' from source: unknown 46400 1727204568.30011: variable 'ansible_search_path' from source: unknown 46400 1727204568.30041: calling self._execute() 46400 1727204568.30116: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.30119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.30127: variable 'omit' from source: magic vars 46400 1727204568.30410: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.30418: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.30504: variable 'network_state' from source: role '' defaults 46400 1727204568.30518: Evaluated conditional (network_state != {}): False 46400 1727204568.30521: when evaluation is False, skipping this task 46400 1727204568.30524: _execute() done 46400 1727204568.30526: dumping result to json 46400 1727204568.30529: done dumping result, returning 46400 1727204568.30532: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-00000000128b] 46400 1727204568.30534: sending task result for task 0affcd87-79f5-1303-fda8-00000000128b 46400 1727204568.30627: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128b 46400 1727204568.30629: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204568.30685: no more pending results, returning what we have 46400 1727204568.30689: results queue empty 46400 1727204568.30690: checking for any_errors_fatal 46400 1727204568.30706: done checking for any_errors_fatal 46400 1727204568.30707: checking for max_fail_percentage 46400 1727204568.30708: done checking for max_fail_percentage 46400 1727204568.30709: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.30710: done checking to see if all hosts have failed 46400 1727204568.30711: getting the remaining hosts for this loop 46400 1727204568.30712: done getting the remaining hosts for this loop 46400 1727204568.30716: getting the next task for host managed-node2 46400 1727204568.30725: done getting next task for host managed-node2 46400 1727204568.30730: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204568.30734: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204568.30766: getting variables 46400 1727204568.30767: in VariableManager get_vars() 46400 1727204568.30800: Calling all_inventory to load vars for managed-node2 46400 1727204568.30803: Calling groups_inventory to load vars for managed-node2 46400 1727204568.30805: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.30814: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.30816: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.30819: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.31813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.32732: done with get_vars() 46400 1727204568.32753: done getting variables 46400 1727204568.32803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.034) 0:00:58.612 ***** 46400 1727204568.32831: entering _queue_task() for managed-node2/debug 46400 1727204568.33086: worker is 1 (out of 1 available) 46400 1727204568.33101: exiting _queue_task() for managed-node2/debug 46400 1727204568.33112: done queuing things up, now waiting for results queue to drain 46400 1727204568.33114: waiting for pending results... 
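The "Show stderr messages for the network_connections" debug task queued above simply prints the stderr_lines captured in the __network_connections_result fact from the earlier module run. A sketch of such a task (the role's actual task at tasks/main.yml:177 may differ in detail):

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

Its output appears below as the single "skipped because already active" line.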
46400 1727204568.33313: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204568.33414: in run() - task 0affcd87-79f5-1303-fda8-00000000128c 46400 1727204568.33433: variable 'ansible_search_path' from source: unknown 46400 1727204568.33437: variable 'ansible_search_path' from source: unknown 46400 1727204568.33471: calling self._execute() 46400 1727204568.33543: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.33546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.33555: variable 'omit' from source: magic vars 46400 1727204568.33874: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.33884: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.33892: variable 'omit' from source: magic vars 46400 1727204568.33937: variable 'omit' from source: magic vars 46400 1727204568.33961: variable 'omit' from source: magic vars 46400 1727204568.33999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204568.34031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204568.34051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204568.34069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.34081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.34104: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204568.34107: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.34111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.34181: Set connection var ansible_shell_type to sh 46400 1727204568.34189: Set connection var ansible_shell_executable to /bin/sh 46400 1727204568.34194: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204568.34199: Set connection var ansible_connection to ssh 46400 1727204568.34204: Set connection var ansible_pipelining to False 46400 1727204568.34209: Set connection var ansible_timeout to 10 46400 1727204568.34229: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.34233: variable 'ansible_connection' from source: unknown 46400 1727204568.34236: variable 'ansible_module_compression' from source: unknown 46400 1727204568.34239: variable 'ansible_shell_type' from source: unknown 46400 1727204568.34241: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.34243: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.34246: variable 'ansible_pipelining' from source: unknown 46400 1727204568.34248: variable 'ansible_timeout' from source: unknown 46400 1727204568.34250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.34381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204568.34390: variable 'omit' from source: magic vars 46400 1727204568.34395: starting attempt loop 46400 1727204568.34398: running the handler 46400 1727204568.34501: variable '__network_connections_result' from source: set_fact 46400 1727204568.34542: handler run complete 46400 1727204568.34557: attempt loop complete, returning result 46400 1727204568.34560: _execute() done 46400 1727204568.34562: dumping result to json 46400 1727204568.34569: done dumping result, returning 46400 1727204568.34578: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-00000000128c] 46400 1727204568.34585: sending task result for task 0affcd87-79f5-1303-fda8-00000000128c 46400 1727204568.34943: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128c 46400 1727204568.34946: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active" ] } 46400 1727204568.35013: no more pending results, returning what we have 46400 1727204568.35016: results queue empty 46400 1727204568.35017: checking for any_errors_fatal 46400 1727204568.35024: done checking for any_errors_fatal 46400 1727204568.35024: checking for max_fail_percentage 46400 1727204568.35026: done checking for max_fail_percentage 46400 1727204568.35027: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.35028: done checking to see if all hosts have failed 46400 1727204568.35029: getting the remaining hosts for this loop 46400 1727204568.35030: done getting the remaining hosts for this loop 46400 1727204568.35033: getting the next task for host managed-node2 46400 1727204568.35041: done getting next task for host managed-node2 46400 1727204568.35044: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204568.35050: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204568.35067: getting variables 46400 1727204568.35068: in VariableManager get_vars() 46400 1727204568.35108: Calling all_inventory to load vars for managed-node2 46400 1727204568.35111: Calling groups_inventory to load vars for managed-node2 46400 1727204568.35114: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.35124: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.35127: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.35130: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.36353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.37298: done with get_vars() 46400 1727204568.37323: done getting variables 46400 1727204568.37376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.045) 0:00:58.658 ***** 46400 1727204568.37408: entering _queue_task() for managed-node2/debug 46400 1727204568.37665: worker is 1 (out of 1 available) 46400 1727204568.37679: exiting _queue_task() for managed-node2/debug 46400 1727204568.37693: done queuing things up, now waiting for results queue to drain 46400 1727204568.37694: waiting for pending results... 46400 1727204568.37920: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204568.38086: in run() - task 0affcd87-79f5-1303-fda8-00000000128d 46400 1727204568.38110: variable 'ansible_search_path' from source: unknown 46400 1727204568.38117: variable 'ansible_search_path' from source: unknown 46400 1727204568.38155: calling self._execute() 46400 1727204568.38259: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.38276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.38290: variable 'omit' from source: magic vars 46400 1727204568.38688: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.38705: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.38716: variable 'omit' from source: magic vars 46400 1727204568.38791: variable 'omit' from source: magic vars 46400 1727204568.38827: variable 'omit' from source: magic vars 46400 1727204568.38881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204568.38920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204568.38947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204568.38977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.38991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.39020: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204568.39026: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.39032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.39134: Set connection var ansible_shell_type to sh 46400 1727204568.39150: Set connection var ansible_shell_executable to /bin/sh 46400 1727204568.39159: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204568.39173: Set connection var ansible_connection to ssh 46400 1727204568.39184: Set connection var ansible_pipelining to False 46400 1727204568.39192: Set connection var ansible_timeout to 10 46400 1727204568.39219: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.39227: variable 'ansible_connection' from source: unknown 46400 1727204568.39235: variable 'ansible_module_compression' from source: unknown 46400 1727204568.39242: variable 'ansible_shell_type' from source: unknown 46400 1727204568.39248: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.39254: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.39266: variable 'ansible_pipelining' from source: unknown 46400 1727204568.39273: variable 'ansible_timeout' from source: unknown 46400 1727204568.39280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.39430: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204568.39448: variable 'omit' from source: magic vars 46400 1727204568.39458: starting attempt loop 46400 1727204568.39470: running the handler 46400 1727204568.39525: variable '__network_connections_result' from source: set_fact 46400 1727204568.39616: variable '__network_connections_result' from source: set_fact 46400 1727204568.39736: handler run complete 46400 1727204568.39770: attempt loop complete, returning result 46400 1727204568.39777: _execute() done 46400 1727204568.39783: dumping result to json 46400 1727204568.39791: done dumping result, returning 46400 1727204568.39804: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-00000000128d] 46400 1727204568.39815: sending task result for task 0affcd87-79f5-1303-fda8-00000000128d ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 32d7bf17-3bad-4841-bdea-bee9f6832024 skipped because already active" ] } } 46400 1727204568.40031: no more pending results, returning what we have 46400 1727204568.40035: results queue empty 46400 1727204568.40036: checking for any_errors_fatal 46400 1727204568.40046: done checking for any_errors_fatal 46400 1727204568.40047: 
checking for max_fail_percentage 46400 1727204568.40049: done checking for max_fail_percentage 46400 1727204568.40050: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.40051: done checking to see if all hosts have failed 46400 1727204568.40051: getting the remaining hosts for this loop 46400 1727204568.40053: done getting the remaining hosts for this loop 46400 1727204568.40057: getting the next task for host managed-node2 46400 1727204568.40078: done getting next task for host managed-node2 46400 1727204568.40083: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204568.40089: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204568.40105: getting variables 46400 1727204568.40106: in VariableManager get_vars() 46400 1727204568.40147: Calling all_inventory to load vars for managed-node2 46400 1727204568.40151: Calling groups_inventory to load vars for managed-node2 46400 1727204568.40153: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.40169: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.40179: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.40182: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.41081: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128d 46400 1727204568.41084: WORKER PROCESS EXITING 46400 1727204568.42148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.45472: done with get_vars() 46400 1727204568.45514: done getting variables 46400 1727204568.45587: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.082) 0:00:58.740 ***** 46400 1727204568.45625: entering _queue_task() for managed-node2/debug 46400 1727204568.46068: worker is 1 (out of 1 available) 46400 1727204568.46080: exiting _queue_task() for managed-node2/debug 46400 1727204568.46094: done queuing things up, now waiting for results queue to drain 46400 1727204568.46095: waiting for pending results... 
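The full __network_connections_result echoed above (changed: false, with stderr noting that the statebr profile was skipped because it is already active) is an ordinary fact, so a test playbook could assert the idempotent outcome. A hypothetical check, not part of this run:

- name: Verify the statebr profile was left untouched (hypothetical)
  ansible.builtin.assert:
    that:
      - __network_connections_result is not changed
      - "'skipped because already active' in __network_connections_result.stderr"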
46400 1727204568.46453: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204568.46708: in run() - task 0affcd87-79f5-1303-fda8-00000000128e 46400 1727204568.46731: variable 'ansible_search_path' from source: unknown 46400 1727204568.46739: variable 'ansible_search_path' from source: unknown 46400 1727204568.46789: calling self._execute() 46400 1727204568.46896: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.46908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.47000: variable 'omit' from source: magic vars 46400 1727204568.47791: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.47810: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.48050: variable 'network_state' from source: role '' defaults 46400 1727204568.48192: Evaluated conditional (network_state != {}): False 46400 1727204568.48202: when evaluation is False, skipping this task 46400 1727204568.48209: _execute() done 46400 1727204568.48215: dumping result to json 46400 1727204568.48223: done dumping result, returning 46400 1727204568.48233: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-00000000128e] 46400 1727204568.48245: sending task result for task 0affcd87-79f5-1303-fda8-00000000128e skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204568.48412: no more pending results, returning what we have 46400 1727204568.48418: results queue empty 46400 1727204568.48420: checking for any_errors_fatal 46400 1727204568.48429: done checking for any_errors_fatal 46400 1727204568.48431: checking for max_fail_percentage 46400 1727204568.48433: done checking for max_fail_percentage 46400 1727204568.48434: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.48435: done checking to see if all hosts have failed 46400 1727204568.48435: getting the remaining hosts for this loop 46400 1727204568.48438: done getting the remaining hosts for this loop 46400 1727204568.48443: getting the next task for host managed-node2 46400 1727204568.48456: done getting next task for host managed-node2 46400 1727204568.48468: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204568.48476: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204568.48506: getting variables 46400 1727204568.48509: in VariableManager get_vars() 46400 1727204568.48555: Calling all_inventory to load vars for managed-node2 46400 1727204568.48558: Calling groups_inventory to load vars for managed-node2 46400 1727204568.48565: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.48579: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.48582: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.48585: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.49851: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128e 46400 1727204568.49857: WORKER PROCESS EXITING 46400 1727204568.50853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.53508: done with get_vars() 46400 1727204568.53534: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.080) 0:00:58.820 ***** 46400 1727204568.53637: entering _queue_task() for managed-node2/ping 46400 1727204568.53893: worker is 1 (out of 1 available) 46400 1727204568.53907: exiting _queue_task() for managed-node2/ping 46400 1727204568.53920: done queuing things up, now waiting for results queue to drain 46400 1727204568.53922: waiting for pending results... 46400 1727204568.54123: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204568.54221: in run() - task 0affcd87-79f5-1303-fda8-00000000128f 46400 1727204568.54231: variable 'ansible_search_path' from source: unknown 46400 1727204568.54235: variable 'ansible_search_path' from source: unknown 46400 1727204568.54274: calling self._execute() 46400 1727204568.54347: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.54353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.54361: variable 'omit' from source: magic vars 46400 1727204568.54653: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.54667: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.54674: variable 'omit' from source: magic vars 46400 1727204568.54721: variable 'omit' from source: magic vars 46400 1727204568.54745: variable 'omit' from source: magic vars 46400 1727204568.54783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204568.54813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204568.54866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204568.54870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.54875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204568.54901: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204568.54904: variable 'ansible_host' from source: host vars 
for 'managed-node2' 46400 1727204568.54906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.55287: Set connection var ansible_shell_type to sh 46400 1727204568.55291: Set connection var ansible_shell_executable to /bin/sh 46400 1727204568.55293: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204568.55296: Set connection var ansible_connection to ssh 46400 1727204568.55298: Set connection var ansible_pipelining to False 46400 1727204568.55300: Set connection var ansible_timeout to 10 46400 1727204568.55301: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.55304: variable 'ansible_connection' from source: unknown 46400 1727204568.55306: variable 'ansible_module_compression' from source: unknown 46400 1727204568.55308: variable 'ansible_shell_type' from source: unknown 46400 1727204568.55310: variable 'ansible_shell_executable' from source: unknown 46400 1727204568.55311: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.55313: variable 'ansible_pipelining' from source: unknown 46400 1727204568.55315: variable 'ansible_timeout' from source: unknown 46400 1727204568.55317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.55357: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204568.55361: variable 'omit' from source: magic vars 46400 1727204568.55363: starting attempt loop 46400 1727204568.55368: running the handler 46400 1727204568.55382: _low_level_execute_command(): starting 46400 1727204568.55385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204568.56112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204568.56125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.56135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.56151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.56199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204568.56207: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204568.56217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.56231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204568.56240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204568.56248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204568.56256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.56268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.56280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.56288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204568.56295: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204568.56307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.56415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204568.56435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204568.56447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.56521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.58173: stdout chunk (state=3): >>>/root <<< 46400 1727204568.58272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.58333: stderr chunk (state=3): >>><<< 46400 1727204568.58339: stdout chunk (state=3): >>><<< 46400 1727204568.58368: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204568.58379: _low_level_execute_command(): starting 46400 1727204568.58386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600 `" && echo ansible-tmp-1727204568.5836682-50383-200135919376600="` echo /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600 `" ) && sleep 0' 46400 1727204568.59494: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 
1727204568.59644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.61237: stdout chunk (state=3): >>>ansible-tmp-1727204568.5836682-50383-200135919376600=/root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600 <<< 46400 1727204568.61354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.61409: stderr chunk (state=3): >>><<< 46400 1727204568.61412: stdout chunk (state=3): >>><<< 46400 1727204568.61428: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204568.5836682-50383-200135919376600=/root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204568.61470: variable 'ansible_module_compression' from source: unknown 46400 1727204568.61503: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204568.61533: variable 'ansible_facts' from source: unknown 46400 1727204568.61588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/AnsiballZ_ping.py 46400 1727204568.61699: Sending initial data 46400 1727204568.61702: Sent initial data (153 bytes) 46400 1727204568.62369: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.62380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.62430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.62433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.62436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.62556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204568.62559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.62562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.64253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204568.64290: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204568.64325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp384j4ecv /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/AnsiballZ_ping.py <<< 46400 1727204568.64362: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204568.65152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.65262: stderr chunk (state=3): >>><<< 46400 1727204568.65296: stdout chunk (state=3): >>><<< 46400 1727204568.65300: done transferring module to remote 46400 1727204568.65302: _low_level_execute_command(): starting 46400 1727204568.65304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/ /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/AnsiballZ_ping.py && sleep 0' 46400 1727204568.65756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.65760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.65798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204568.65804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204568.65809: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204568.65817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.65822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.65831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.65844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.65908: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204568.65912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.65977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.67676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.67727: stderr chunk (state=3): >>><<< 46400 1727204568.67755: stdout chunk (state=3): >>><<< 46400 1727204568.67802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204568.67844: _low_level_execute_command(): starting 46400 1727204568.67917: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/AnsiballZ_ping.py && sleep 0' 46400 1727204568.68587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.68595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.68646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.68649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204568.68651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.68654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.68703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204568.68708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.68772: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 46400 1727204568.81725: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204568.82738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204568.82790: stderr chunk (state=3): >>><<< 46400 1727204568.82794: stdout chunk (state=3): >>><<< 46400 1727204568.82810: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204568.82832: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204568.82840: _low_level_execute_command(): starting 46400 1727204568.82845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204568.5836682-50383-200135919376600/ > /dev/null 2>&1 && sleep 0' 46400 1727204568.83286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204568.83290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204568.83334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.83338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204568.83340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204568.83392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204568.83395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204568.83403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204568.83456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204568.85342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204568.85351: stdout chunk (state=3): >>><<< 46400 1727204568.85370: stderr chunk (state=3): >>><<< 46400 1727204568.85385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204568.85392: handler run complete 46400 1727204568.85408: attempt loop complete, returning result 46400 1727204568.85411: _execute() done 46400 1727204568.85421: dumping result to json 46400 1727204568.85430: done dumping result, returning 46400 1727204568.85445: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-00000000128f] 46400 1727204568.85452: sending task result for task 0affcd87-79f5-1303-fda8-00000000128f 46400 1727204568.85825: done sending task result for task 0affcd87-79f5-1303-fda8-00000000128f 46400 1727204568.85828: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204568.85906: no more pending results, returning what we have 46400 1727204568.85909: results queue empty 46400 1727204568.85910: checking for any_errors_fatal 46400 1727204568.85916: done checking for any_errors_fatal 46400 1727204568.85917: checking for max_fail_percentage 46400 1727204568.85919: done checking for max_fail_percentage 46400 1727204568.85920: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.85921: done checking to see if all hosts have failed 46400 1727204568.85922: getting the remaining hosts for this loop 46400 1727204568.85923: done getting the remaining hosts for this loop 46400 
1727204568.85926: getting the next task for host managed-node2 46400 1727204568.85937: done getting next task for host managed-node2 46400 1727204568.85939: ^ task is: TASK: meta (role_complete) 46400 1727204568.85944: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204568.85957: getting variables 46400 1727204568.85959: in VariableManager get_vars() 46400 1727204568.86006: Calling all_inventory to load vars for managed-node2 46400 1727204568.86009: Calling groups_inventory to load vars for managed-node2 46400 1727204568.86012: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.86022: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.86025: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.86028: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.87311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.88268: done with get_vars() 46400 1727204568.88294: done getting variables 46400 1727204568.88390: done queuing things up, now waiting for results queue to drain 46400 1727204568.88392: results queue empty 46400 1727204568.88393: checking for any_errors_fatal 46400 1727204568.88396: done checking for any_errors_fatal 46400 1727204568.88397: checking for max_fail_percentage 46400 1727204568.88398: done checking for max_fail_percentage 46400 1727204568.88399: checking to see if all hosts have failed and the running result is not ok 46400 1727204568.88399: done checking to see if all hosts have failed 46400 1727204568.88400: getting the remaining hosts for this loop 46400 1727204568.88401: done getting the remaining hosts for this loop 46400 1727204568.88404: getting the next task for host managed-node2 46400 1727204568.88410: done getting next task for host managed-node2 46400 1727204568.88413: ^ task is: TASK: Test 46400 1727204568.88415: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204568.88418: getting variables 46400 1727204568.88419: in VariableManager get_vars() 46400 1727204568.88432: Calling all_inventory to load vars for managed-node2 46400 1727204568.88434: Calling groups_inventory to load vars for managed-node2 46400 1727204568.88437: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204568.88443: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.88445: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.88453: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.91208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.93008: done with get_vars() 46400 1727204568.93044: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:02:48 -0400 (0:00:00.402) 0:00:59.223 ***** 46400 1727204568.93853: entering _queue_task() for managed-node2/include_tasks 46400 1727204568.95013: worker is 1 (out of 1 available) 46400 1727204568.95033: exiting _queue_task() for managed-node2/include_tasks 46400 1727204568.95055: done queuing things up, now waiting for results queue to drain 46400 1727204568.95057: waiting for pending results... 46400 1727204568.95509: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204568.95682: in run() - task 0affcd87-79f5-1303-fda8-000000001009 46400 1727204568.95695: variable 'ansible_search_path' from source: unknown 46400 1727204568.95699: variable 'ansible_search_path' from source: unknown 46400 1727204568.95737: variable 'lsr_test' from source: include params 46400 1727204568.96023: variable 'lsr_test' from source: include params 46400 1727204568.96124: variable 'omit' from source: magic vars 46400 1727204568.96339: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204568.96365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204568.96393: variable 'omit' from source: magic vars 46400 1727204568.96707: variable 'ansible_distribution_major_version' from source: facts 46400 1727204568.96719: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204568.96725: variable 'item' from source: unknown 46400 1727204568.96777: variable 'item' from source: unknown 46400 1727204568.96802: variable 'item' from source: unknown 46400 1727204568.96859: variable 'item' from source: unknown 46400 1727204568.97046: dumping result to json 46400 1727204568.97049: done dumping result, returning 46400 1727204568.97051: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-000000001009] 46400 1727204568.97054: sending task result for task 0affcd87-79f5-1303-fda8-000000001009 46400 1727204568.97100: done sending task result for task 0affcd87-79f5-1303-fda8-000000001009 46400 1727204568.97103: WORKER PROCESS EXITING 46400 1727204568.97123: no more pending results, returning what we have 46400 1727204568.97128: in VariableManager get_vars() 46400 1727204568.97189: Calling all_inventory to load vars for managed-node2 46400 1727204568.97192: Calling groups_inventory to load vars for managed-node2 46400 1727204568.97195: Calling all_plugins_inventory to load vars for managed-node2 46400 
1727204568.97206: Calling all_plugins_play to load vars for managed-node2 46400 1727204568.97209: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204568.97212: Calling groups_plugins_play to load vars for managed-node2 46400 1727204568.98116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204568.99035: done with get_vars() 46400 1727204568.99051: variable 'ansible_search_path' from source: unknown 46400 1727204568.99052: variable 'ansible_search_path' from source: unknown 46400 1727204568.99082: we have included files to process 46400 1727204568.99083: generating all_blocks data 46400 1727204568.99085: done generating all_blocks data 46400 1727204568.99089: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204568.99090: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204568.99091: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204568.99214: done processing included file 46400 1727204568.99216: iterating over new_blocks loaded from include file 46400 1727204568.99217: in VariableManager get_vars() 46400 1727204568.99228: done with get_vars() 46400 1727204568.99229: filtering new block on tags 46400 1727204568.99247: done filtering new block on tags 46400 1727204568.99248: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed-node2 => (item=tasks/remove_profile.yml) 46400 1727204568.99252: extending task lists for all hosts with included blocks 46400 1727204568.99994: done extending task lists 46400 1727204568.99995: done processing included files 46400 1727204568.99996: results queue empty 46400 1727204568.99997: checking for any_errors_fatal 46400 1727204569.00002: done checking for any_errors_fatal 46400 1727204569.00003: checking for max_fail_percentage 46400 1727204569.00004: done checking for max_fail_percentage 46400 1727204569.00005: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.00006: done checking to see if all hosts have failed 46400 1727204569.00007: getting the remaining hosts for this loop 46400 1727204569.00008: done getting the remaining hosts for this loop 46400 1727204569.00011: getting the next task for host managed-node2 46400 1727204569.00016: done getting next task for host managed-node2 46400 1727204569.00018: ^ task is: TASK: Include network role 46400 1727204569.00020: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204569.00022: getting variables 46400 1727204569.00023: in VariableManager get_vars() 46400 1727204569.00031: Calling all_inventory to load vars for managed-node2 46400 1727204569.00033: Calling groups_inventory to load vars for managed-node2 46400 1727204569.00035: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.00051: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.00059: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.00065: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.08144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.11490: done with get_vars() 46400 1727204569.11529: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.178) 0:00:59.401 ***** 46400 1727204569.11733: entering _queue_task() for managed-node2/include_role 46400 1727204569.12309: worker is 1 (out of 1 available) 46400 1727204569.12321: exiting _queue_task() for managed-node2/include_role 46400 1727204569.12340: done queuing things up, now waiting for results queue to drain 46400 1727204569.12342: waiting for pending results... 46400 1727204569.12741: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204569.12880: in run() - task 0affcd87-79f5-1303-fda8-0000000013e8 46400 1727204569.12892: variable 'ansible_search_path' from source: unknown 46400 1727204569.12897: variable 'ansible_search_path' from source: unknown 46400 1727204569.13047: calling self._execute() 46400 1727204569.13188: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.13194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.13204: variable 'omit' from source: magic vars 46400 1727204569.14010: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.14021: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.14028: _execute() done 46400 1727204569.14032: dumping result to json 46400 1727204569.14035: done dumping result, returning 46400 1727204569.14042: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-0000000013e8] 46400 1727204569.14048: sending task result for task 0affcd87-79f5-1303-fda8-0000000013e8 46400 1727204569.14176: done sending task result for task 0affcd87-79f5-1303-fda8-0000000013e8 46400 1727204569.14180: WORKER PROCESS EXITING 46400 1727204569.14214: no more pending results, returning what we have 46400 1727204569.14220: in VariableManager get_vars() 46400 1727204569.14274: Calling all_inventory to load vars for managed-node2 46400 1727204569.14277: Calling groups_inventory to load vars for managed-node2 46400 1727204569.14282: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.14296: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.14300: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.14304: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.15496: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.18160: done with get_vars() 46400 1727204569.18190: variable 'ansible_search_path' from source: unknown 46400 1727204569.18192: variable 'ansible_search_path' from source: unknown 46400 1727204569.18424: variable 'omit' from source: magic vars 46400 1727204569.18491: variable 'omit' from source: magic vars 46400 1727204569.18510: variable 'omit' from source: magic vars 46400 1727204569.18513: we have included files to process 46400 1727204569.18514: generating all_blocks data 46400 1727204569.18515: done generating all_blocks data 46400 1727204569.18516: processing included file: fedora.linux_system_roles.network 46400 1727204569.18557: in VariableManager get_vars() 46400 1727204569.18575: done with get_vars() 46400 1727204569.18627: in VariableManager get_vars() 46400 1727204569.18659: done with get_vars() 46400 1727204569.18711: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204569.18844: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204569.18949: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204569.19723: in VariableManager get_vars() 46400 1727204569.19746: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204569.22142: iterating over new_blocks loaded from include file 46400 1727204569.22144: in VariableManager get_vars() 46400 1727204569.22177: done with get_vars() 46400 1727204569.22179: filtering new block on tags 46400 1727204569.22599: done filtering new block on tags 46400 1727204569.22605: in VariableManager get_vars() 46400 1727204569.22631: done with get_vars() 46400 1727204569.22633: filtering new block on tags 46400 1727204569.22660: done filtering new block on tags 46400 1727204569.22665: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204569.22671: extending task lists for all hosts with included blocks 46400 1727204569.22832: done extending task lists 46400 1727204569.22833: done processing included files 46400 1727204569.22834: results queue empty 46400 1727204569.22835: checking for any_errors_fatal 46400 1727204569.22843: done checking for any_errors_fatal 46400 1727204569.22844: checking for max_fail_percentage 46400 1727204569.22845: done checking for max_fail_percentage 46400 1727204569.22847: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.22848: done checking to see if all hosts have failed 46400 1727204569.22849: getting the remaining hosts for this loop 46400 1727204569.22850: done getting the remaining hosts for this loop 46400 1727204569.22853: getting the next task for host managed-node2 46400 1727204569.22860: done getting next task for host managed-node2 46400 1727204569.22863: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204569.22870: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204569.22884: getting variables 46400 1727204569.22885: in VariableManager get_vars() 46400 1727204569.22908: Calling all_inventory to load vars for managed-node2 46400 1727204569.22913: Calling groups_inventory to load vars for managed-node2 46400 1727204569.22916: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.22924: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.22927: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.22930: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.24479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.26357: done with get_vars() 46400 1727204569.26396: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.147) 0:00:59.549 ***** 46400 1727204569.26490: entering _queue_task() for managed-node2/include_tasks 46400 1727204569.26777: worker is 1 (out of 1 available) 46400 1727204569.26790: exiting _queue_task() for managed-node2/include_tasks 46400 1727204569.26805: done queuing things up, now waiting for results queue to drain 46400 1727204569.26807: waiting for pending results... 
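For context on the include queued above: the "Include network role" task at remove_profile.yml:3 re-enters fedora.linux_system_roles.network, and the only conditional the log shows being evaluated for it is ansible_distribution_major_version != '6'. A minimal sketch of a task with that shape follows; it is an illustration under those assumptions, since remove_profile.yml itself is not reproduced in this log and the version guard may be inherited from an enclosing block rather than written on the task.

- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  # Placement of the guard on this task is an assumption; the log only shows
  # the expression being evaluated to True while this task executes.
  when: ansible_distribution_major_version != '6'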
46400 1727204569.26996: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204569.27095: in run() - task 0affcd87-79f5-1303-fda8-00000000145f 46400 1727204569.27106: variable 'ansible_search_path' from source: unknown 46400 1727204569.27111: variable 'ansible_search_path' from source: unknown 46400 1727204569.27141: calling self._execute() 46400 1727204569.27222: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.27226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.27235: variable 'omit' from source: magic vars 46400 1727204569.27521: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.27531: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.27538: _execute() done 46400 1727204569.27541: dumping result to json 46400 1727204569.27544: done dumping result, returning 46400 1727204569.27550: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-00000000145f] 46400 1727204569.27562: sending task result for task 0affcd87-79f5-1303-fda8-00000000145f 46400 1727204569.27648: done sending task result for task 0affcd87-79f5-1303-fda8-00000000145f 46400 1727204569.27651: WORKER PROCESS EXITING 46400 1727204569.27731: no more pending results, returning what we have 46400 1727204569.27735: in VariableManager get_vars() 46400 1727204569.27792: Calling all_inventory to load vars for managed-node2 46400 1727204569.27795: Calling groups_inventory to load vars for managed-node2 46400 1727204569.27797: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.27809: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.27812: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.27815: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.29149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.30773: done with get_vars() 46400 1727204569.30794: variable 'ansible_search_path' from source: unknown 46400 1727204569.30795: variable 'ansible_search_path' from source: unknown 46400 1727204569.30841: we have included files to process 46400 1727204569.30843: generating all_blocks data 46400 1727204569.30845: done generating all_blocks data 46400 1727204569.30849: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204569.30850: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204569.30851: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204569.31451: done processing included file 46400 1727204569.31453: iterating over new_blocks loaded from include file 46400 1727204569.31455: in VariableManager get_vars() 46400 1727204569.31485: done with get_vars() 46400 1727204569.31487: filtering new block on tags 46400 1727204569.31520: done filtering new block on tags 46400 1727204569.31524: in VariableManager get_vars() 46400 1727204569.31547: done with get_vars() 46400 1727204569.31548: filtering new block on tags 46400 1727204569.31600: done filtering new block on tags 46400 1727204569.31603: in 
VariableManager get_vars() 46400 1727204569.31626: done with get_vars() 46400 1727204569.31627: filtering new block on tags 46400 1727204569.31672: done filtering new block on tags 46400 1727204569.31675: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204569.31680: extending task lists for all hosts with included blocks 46400 1727204569.32996: done extending task lists 46400 1727204569.32997: done processing included files 46400 1727204569.32998: results queue empty 46400 1727204569.32998: checking for any_errors_fatal 46400 1727204569.33001: done checking for any_errors_fatal 46400 1727204569.33002: checking for max_fail_percentage 46400 1727204569.33002: done checking for max_fail_percentage 46400 1727204569.33003: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.33003: done checking to see if all hosts have failed 46400 1727204569.33004: getting the remaining hosts for this loop 46400 1727204569.33005: done getting the remaining hosts for this loop 46400 1727204569.33007: getting the next task for host managed-node2 46400 1727204569.33010: done getting next task for host managed-node2 46400 1727204569.33012: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204569.33015: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204569.33023: getting variables 46400 1727204569.33024: in VariableManager get_vars() 46400 1727204569.33036: Calling all_inventory to load vars for managed-node2 46400 1727204569.33038: Calling groups_inventory to load vars for managed-node2 46400 1727204569.33040: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.33044: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.33046: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.33047: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.33735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.35278: done with get_vars() 46400 1727204569.35305: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.088) 0:00:59.638 ***** 46400 1727204569.35375: entering _queue_task() for managed-node2/setup 46400 1727204569.35636: worker is 1 (out of 1 available) 46400 1727204569.35647: exiting _queue_task() for managed-node2/setup 46400 1727204569.35663: done queuing things up, now waiting for results queue to drain 46400 1727204569.35666: waiting for pending results... 46400 1727204569.35854: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204569.35952: in run() - task 0affcd87-79f5-1303-fda8-0000000014b6 46400 1727204569.35968: variable 'ansible_search_path' from source: unknown 46400 1727204569.35972: variable 'ansible_search_path' from source: unknown 46400 1727204569.36003: calling self._execute() 46400 1727204569.36082: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.36087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.36100: variable 'omit' from source: magic vars 46400 1727204569.36381: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.36390: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.36549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204569.38192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204569.38236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204569.38262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204569.38293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204569.38314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204569.38373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204569.38397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204569.38416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204569.38443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204569.38454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204569.38497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204569.38514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204569.38531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204569.38555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204569.38570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204569.38684: variable '__network_required_facts' from source: role '' defaults 46400 1727204569.38691: variable 'ansible_facts' from source: unknown 46400 1727204569.39160: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204569.39166: when evaluation is False, skipping this task 46400 1727204569.39169: _execute() done 46400 1727204569.39173: dumping result to json 46400 1727204569.39176: done dumping result, returning 46400 1727204569.39182: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-0000000014b6] 46400 1727204569.39188: sending task result for task 0affcd87-79f5-1303-fda8-0000000014b6 46400 1727204569.39278: done sending task result for task 0affcd87-79f5-1303-fda8-0000000014b6 46400 1727204569.39281: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204569.39326: no more pending results, returning what we have 46400 1727204569.39330: results queue empty 46400 1727204569.39331: checking for any_errors_fatal 46400 1727204569.39333: done checking for any_errors_fatal 46400 1727204569.39334: checking for max_fail_percentage 46400 1727204569.39336: done checking for max_fail_percentage 46400 1727204569.39337: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.39337: done checking to see if all hosts have failed 46400 1727204569.39338: getting the remaining hosts for this loop 46400 1727204569.39340: done getting the remaining hosts for 
this loop 46400 1727204569.39343: getting the next task for host managed-node2 46400 1727204569.39356: done getting next task for host managed-node2 46400 1727204569.39360: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204569.39367: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204569.39398: getting variables 46400 1727204569.39400: in VariableManager get_vars() 46400 1727204569.39442: Calling all_inventory to load vars for managed-node2 46400 1727204569.39444: Calling groups_inventory to load vars for managed-node2 46400 1727204569.39446: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.39457: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.39459: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.39477: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.40311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.41258: done with get_vars() 46400 1727204569.41280: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.059) 0:00:59.698 ***** 46400 1727204569.41357: entering _queue_task() for managed-node2/stat 46400 1727204569.41607: worker is 1 (out of 1 available) 46400 1727204569.41623: exiting _queue_task() for managed-node2/stat 46400 1727204569.41637: done queuing things up, now waiting for results queue to drain 46400 1727204569.41639: waiting for pending results... 
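The tasks handled in this stretch of the log all come from the role file referenced in the task banners, roles/network/tasks/set_facts.yml (lines 12, 17 and 21). That file is not reproduced in the log, but the conditionals being evaluated ("__network_required_facts | difference(ansible_facts.keys() | list) | length > 0" and "not __network_is_ostree is defined") and the no_log censoring seen above suggest the usual fact-gating pattern sketched below. Only the task names, the conditionals and the no_log flag are taken from the log; the setup filter details and the register name __ostree_booted_stat are illustrative assumptions.

  # Sketch of roles/network/tasks/set_facts.yml (assumed layout, not copied from the log)
  - name: Ensure ansible_facts used by role are present
    setup:
      gather_subset: min
      filter: "{{ __network_required_facts_subsets }}"   # assumed helper variable
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    no_log: true   # matches the "censored" result shown above

  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted          # conventional ostree marker; assumed path
    register: __ostree_booted_stat      # assumed register name
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined

Both ostree tasks are skipped in the entries that follow because __network_is_ostree was already set by set_fact earlier in the run, so "not __network_is_ostree is defined" evaluates to False.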
46400 1727204569.41840: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204569.41942: in run() - task 0affcd87-79f5-1303-fda8-0000000014b8 46400 1727204569.41954: variable 'ansible_search_path' from source: unknown 46400 1727204569.41958: variable 'ansible_search_path' from source: unknown 46400 1727204569.41994: calling self._execute() 46400 1727204569.42073: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.42079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.42085: variable 'omit' from source: magic vars 46400 1727204569.42373: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.42383: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.42511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204569.42711: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204569.42745: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204569.42777: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204569.42802: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204569.42870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204569.42892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204569.42910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204569.42928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204569.43000: variable '__network_is_ostree' from source: set_fact 46400 1727204569.43006: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204569.43009: when evaluation is False, skipping this task 46400 1727204569.43011: _execute() done 46400 1727204569.43013: dumping result to json 46400 1727204569.43017: done dumping result, returning 46400 1727204569.43024: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-0000000014b8] 46400 1727204569.43030: sending task result for task 0affcd87-79f5-1303-fda8-0000000014b8 46400 1727204569.43121: done sending task result for task 0affcd87-79f5-1303-fda8-0000000014b8 46400 1727204569.43124: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204569.43177: no more pending results, returning what we have 46400 1727204569.43181: results queue empty 46400 1727204569.43182: checking for any_errors_fatal 46400 1727204569.43191: done checking for any_errors_fatal 46400 1727204569.43192: checking for 
max_fail_percentage 46400 1727204569.43194: done checking for max_fail_percentage 46400 1727204569.43195: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.43196: done checking to see if all hosts have failed 46400 1727204569.43196: getting the remaining hosts for this loop 46400 1727204569.43198: done getting the remaining hosts for this loop 46400 1727204569.43201: getting the next task for host managed-node2 46400 1727204569.43210: done getting next task for host managed-node2 46400 1727204569.43214: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204569.43220: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204569.43250: getting variables 46400 1727204569.43252: in VariableManager get_vars() 46400 1727204569.43291: Calling all_inventory to load vars for managed-node2 46400 1727204569.43294: Calling groups_inventory to load vars for managed-node2 46400 1727204569.43296: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.43305: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.43307: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.43310: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.44305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.45221: done with get_vars() 46400 1727204569.45241: done getting variables 46400 1727204569.45289: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.039) 0:00:59.737 ***** 46400 1727204569.45320: entering _queue_task() for managed-node2/set_fact 46400 1727204569.45571: worker is 1 (out of 1 available) 46400 1727204569.45588: exiting _queue_task() for managed-node2/set_fact 46400 1727204569.45602: done queuing things up, now waiting for results queue to drain 46400 1727204569.45604: waiting for pending results... 46400 1727204569.45798: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204569.45908: in run() - task 0affcd87-79f5-1303-fda8-0000000014b9 46400 1727204569.45918: variable 'ansible_search_path' from source: unknown 46400 1727204569.45922: variable 'ansible_search_path' from source: unknown 46400 1727204569.45952: calling self._execute() 46400 1727204569.46025: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.46029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.46039: variable 'omit' from source: magic vars 46400 1727204569.46321: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.46331: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.46451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204569.46652: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204569.46693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204569.46719: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204569.46745: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204569.46810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204569.46829: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204569.46850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204569.46872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204569.46937: variable '__network_is_ostree' from source: set_fact 46400 1727204569.46942: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204569.46946: when evaluation is False, skipping this task 46400 1727204569.46948: _execute() done 46400 1727204569.46951: dumping result to json 46400 1727204569.46953: done dumping result, returning 46400 1727204569.46961: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-0000000014b9] 46400 1727204569.46970: sending task result for task 0affcd87-79f5-1303-fda8-0000000014b9 46400 1727204569.47053: done sending task result for task 0affcd87-79f5-1303-fda8-0000000014b9 46400 1727204569.47056: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204569.47105: no more pending results, returning what we have 46400 1727204569.47110: results queue empty 46400 1727204569.47111: checking for any_errors_fatal 46400 1727204569.47117: done checking for any_errors_fatal 46400 1727204569.47118: checking for max_fail_percentage 46400 1727204569.47119: done checking for max_fail_percentage 46400 1727204569.47120: checking to see if all hosts have failed and the running result is not ok 46400 1727204569.47121: done checking to see if all hosts have failed 46400 1727204569.47122: getting the remaining hosts for this loop 46400 1727204569.47123: done getting the remaining hosts for this loop 46400 1727204569.47127: getting the next task for host managed-node2 46400 1727204569.47139: done getting next task for host managed-node2 46400 1727204569.47143: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204569.47149: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204569.47173: getting variables 46400 1727204569.47179: in VariableManager get_vars() 46400 1727204569.47218: Calling all_inventory to load vars for managed-node2 46400 1727204569.47221: Calling groups_inventory to load vars for managed-node2 46400 1727204569.47223: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204569.47232: Calling all_plugins_play to load vars for managed-node2 46400 1727204569.47234: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204569.47237: Calling groups_plugins_play to load vars for managed-node2 46400 1727204569.48053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204569.48989: done with get_vars() 46400 1727204569.49005: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:49 -0400 (0:00:00.037) 0:00:59.775 ***** 46400 1727204569.49080: entering _queue_task() for managed-node2/service_facts 46400 1727204569.49307: worker is 1 (out of 1 available) 46400 1727204569.49321: exiting _queue_task() for managed-node2/service_facts 46400 1727204569.49335: done queuing things up, now waiting for results queue to drain 46400 1727204569.49337: waiting for pending results... 
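Unlike the two ostree tasks, this task actually runs on the remote host. The entries that follow show Ansible's standard remote-execution flow over the already-established SSH connection ("auto-mux: Trying existing master"): run /bin/sh -c 'echo ~ && sleep 0' to resolve the remote home directory, create a per-task temp directory under /root/.ansible/tmp, sftp the packaged AnsiballZ_service_facts.py module into it, chmod it, execute it with /usr/bin/python3.9, and read the service facts back as JSON on stdout. The task itself is a bare service_facts call (module_args is empty in the invocation below); how the role consumes the result is not shown in this log, so the second task in the sketch is only an illustration and the variable name network_provider_current is an assumption.

  # Sketch of the task at roles/network/tasks/set_facts.yml:21
  - name: Check which services are running
    service_facts:

  # Illustrative consumer of the gathered facts (not taken from the log);
  # the services dictionary keyed by unit name matches the JSON returned below.
  - name: Record whether NetworkManager is the running provider
    set_fact:
      network_provider_current: "{{ 'nm' if ansible_facts.services['NetworkManager.service'].state == 'running' else 'initscripts' }}"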
46400 1727204569.49526: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204569.49628: in run() - task 0affcd87-79f5-1303-fda8-0000000014bb 46400 1727204569.49640: variable 'ansible_search_path' from source: unknown 46400 1727204569.49644: variable 'ansible_search_path' from source: unknown 46400 1727204569.49675: calling self._execute() 46400 1727204569.49755: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.49762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.49771: variable 'omit' from source: magic vars 46400 1727204569.50046: variable 'ansible_distribution_major_version' from source: facts 46400 1727204569.50056: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204569.50066: variable 'omit' from source: magic vars 46400 1727204569.50209: variable 'omit' from source: magic vars 46400 1727204569.50213: variable 'omit' from source: magic vars 46400 1727204569.50215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204569.50234: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204569.50257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204569.50278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204569.50289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204569.50317: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204569.50321: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.50325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.50531: Set connection var ansible_shell_type to sh 46400 1727204569.50550: Set connection var ansible_shell_executable to /bin/sh 46400 1727204569.50563: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204569.50575: Set connection var ansible_connection to ssh 46400 1727204569.50585: Set connection var ansible_pipelining to False 46400 1727204569.50595: Set connection var ansible_timeout to 10 46400 1727204569.50631: variable 'ansible_shell_executable' from source: unknown 46400 1727204569.50639: variable 'ansible_connection' from source: unknown 46400 1727204569.50647: variable 'ansible_module_compression' from source: unknown 46400 1727204569.50658: variable 'ansible_shell_type' from source: unknown 46400 1727204569.50671: variable 'ansible_shell_executable' from source: unknown 46400 1727204569.50679: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204569.50687: variable 'ansible_pipelining' from source: unknown 46400 1727204569.50694: variable 'ansible_timeout' from source: unknown 46400 1727204569.50701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204569.50937: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204569.50965: variable 'omit' from source: magic vars 46400 
1727204569.50976: starting attempt loop 46400 1727204569.50986: running the handler 46400 1727204569.51006: _low_level_execute_command(): starting 46400 1727204569.51019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204569.51876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.51908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204569.51911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.51914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.51917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.51982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204569.51988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204569.52055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204569.53702: stdout chunk (state=3): >>>/root <<< 46400 1727204569.53807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204569.53860: stderr chunk (state=3): >>><<< 46400 1727204569.53864: stdout chunk (state=3): >>><<< 46400 1727204569.53889: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204569.53902: _low_level_execute_command(): starting 46400 1727204569.53909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677 `" && echo 
ansible-tmp-1727204569.5388842-50466-184690445262677="` echo /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677 `" ) && sleep 0' 46400 1727204569.54378: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204569.54381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204569.54420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.54432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.54435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.54475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204569.54487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204569.54534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204569.56409: stdout chunk (state=3): >>>ansible-tmp-1727204569.5388842-50466-184690445262677=/root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677 <<< 46400 1727204569.56517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204569.56582: stderr chunk (state=3): >>><<< 46400 1727204569.56585: stdout chunk (state=3): >>><<< 46400 1727204569.56598: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204569.5388842-50466-184690445262677=/root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204569.56640: variable 'ansible_module_compression' from source: unknown 46400 1727204569.56682: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204569.56715: variable 'ansible_facts' from source: unknown 46400 1727204569.56777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/AnsiballZ_service_facts.py 46400 1727204569.56891: Sending initial data 46400 1727204569.56894: Sent initial data (162 bytes) 46400 1727204569.57579: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204569.57585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.57596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204569.57641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204569.57645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.57647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.57710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204569.57713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204569.57718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204569.57757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204569.59482: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204569.59516: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204569.59554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpqz7jvkox /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/AnsiballZ_service_facts.py <<< 46400 1727204569.59590: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204569.60403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204569.60519: stderr chunk (state=3): >>><<< 46400 1727204569.60522: stdout chunk (state=3): >>><<< 46400 1727204569.60536: done transferring module to remote 46400 1727204569.60546: _low_level_execute_command(): starting 46400 
1727204569.60551: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/ /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/AnsiballZ_service_facts.py && sleep 0' 46400 1727204569.61024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204569.61030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204569.61063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.61079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204569.61089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.61136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204569.61148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204569.61202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204569.62939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204569.63001: stderr chunk (state=3): >>><<< 46400 1727204569.63006: stdout chunk (state=3): >>><<< 46400 1727204569.63021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204569.63024: _low_level_execute_command(): starting 46400 1727204569.63030: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/AnsiballZ_service_facts.py && sleep 0' 46400 1727204569.63504: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204569.63517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204569.63538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204569.63554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204569.63610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204569.63616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204569.63680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204570.92404: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 46400 1727204570.92441: stdout chunk 
(state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 46400 1727204570.92506: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": 
"systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204570.93802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204570.93806: stdout chunk (state=3): >>><<< 46400 1727204570.93808: stderr chunk (state=3): >>><<< 46400 1727204570.93978: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204570.94924: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204570.94940: _low_level_execute_command(): starting 46400 1727204570.94955: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204569.5388842-50466-184690445262677/ > /dev/null 2>&1 && sleep 0' 46400 1727204570.95810: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204570.95883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204570.95899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204570.95916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204570.95963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204570.96001: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204570.96014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204570.96030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204570.96046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204570.96056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204570.96070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204570.96084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204570.96111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204570.96123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204570.96140: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204570.96157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204570.96282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204570.96306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204570.96327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204570.96397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204570.98267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204570.98271: stdout chunk (state=3): >>><<< 46400 1727204570.98274: stderr chunk (state=3): >>><<< 46400 1727204570.98370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204570.98374: handler run complete 46400 1727204570.98497: variable 'ansible_facts' from source: unknown 46400 1727204570.98669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204570.99109: variable 'ansible_facts' from source: unknown 46400 1727204570.99242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204570.99501: attempt loop complete, returning result 46400 1727204570.99511: _execute() done 46400 1727204570.99519: dumping result to json 46400 1727204570.99584: done dumping result, returning 46400 1727204570.99604: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-0000000014bb] 46400 1727204570.99614: sending task result for task 0affcd87-79f5-1303-fda8-0000000014bb ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204571.00829: no more pending results, returning what we have 46400 1727204571.00833: results queue empty 46400 1727204571.00834: checking for any_errors_fatal 46400 1727204571.00839: done checking for any_errors_fatal 46400 1727204571.00840: checking for max_fail_percentage 46400 1727204571.00842: done checking for max_fail_percentage 46400 1727204571.00843: checking to see if all hosts have failed and the running result is not ok 46400 1727204571.00844: done checking to see if all hosts have failed 46400 1727204571.00845: getting the remaining hosts for this loop 46400 1727204571.00846: done getting the remaining hosts for this loop 46400 1727204571.00850: getting the next task for host managed-node2 46400 1727204571.00858: done getting next task for host managed-node2 46400 1727204571.00863: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204571.00872: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204571.00884: getting variables 46400 1727204571.00886: in VariableManager get_vars() 46400 1727204571.00923: Calling all_inventory to load vars for managed-node2 46400 1727204571.00926: Calling groups_inventory to load vars for managed-node2 46400 1727204571.00928: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204571.00939: Calling all_plugins_play to load vars for managed-node2 46400 1727204571.00942: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204571.00945: Calling groups_plugins_play to load vars for managed-node2 46400 1727204571.02017: done sending task result for task 0affcd87-79f5-1303-fda8-0000000014bb 46400 1727204571.02021: WORKER PROCESS EXITING 46400 1727204571.02823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204571.05180: done with get_vars() 46400 1727204571.05215: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:51 -0400 (0:00:01.562) 0:01:01.337 ***** 46400 1727204571.05330: entering _queue_task() for managed-node2/package_facts 46400 1727204571.05945: worker is 1 (out of 1 available) 46400 1727204571.05958: exiting _queue_task() for managed-node2/package_facts 46400 1727204571.05976: done queuing things up, now waiting for results queue to drain 46400 1727204571.05978: waiting for pending results... 
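The play now moves to the role's next task, which gathers installed-package information with the package_facts module; its result is exposed as ansible_facts.packages, a dictionary keyed by package name whose values are lists of {name, version, release, epoch, arch, source} entries, as visible in the module output further below. The actual task body at set_facts.yml:26 is not reproduced in this log, so the following is only a minimal sketch, with an illustrative debug consumer (the glibc key is taken from the output below purely as an example):

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto

- name: Illustrative consumer of the package facts (not part of this run)
  ansible.builtin.debug:
    msg: "glibc {{ ansible_facts.packages['glibc'][0].version }} is installed"
  when: "'glibc' in ansible_facts.packages"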
46400 1727204571.06659: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204571.07015: in run() - task 0affcd87-79f5-1303-fda8-0000000014bc 46400 1727204571.07037: variable 'ansible_search_path' from source: unknown 46400 1727204571.07047: variable 'ansible_search_path' from source: unknown 46400 1727204571.07098: calling self._execute() 46400 1727204571.07206: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204571.07222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204571.07241: variable 'omit' from source: magic vars 46400 1727204571.07730: variable 'ansible_distribution_major_version' from source: facts 46400 1727204571.07748: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204571.07758: variable 'omit' from source: magic vars 46400 1727204571.07847: variable 'omit' from source: magic vars 46400 1727204571.07891: variable 'omit' from source: magic vars 46400 1727204571.07939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204571.07984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204571.08010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204571.08085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204571.08103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204571.08137: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204571.08300: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204571.08310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204571.08421: Set connection var ansible_shell_type to sh 46400 1727204571.08436: Set connection var ansible_shell_executable to /bin/sh 46400 1727204571.08447: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204571.08457: Set connection var ansible_connection to ssh 46400 1727204571.08472: Set connection var ansible_pipelining to False 46400 1727204571.08482: Set connection var ansible_timeout to 10 46400 1727204571.08510: variable 'ansible_shell_executable' from source: unknown 46400 1727204571.08517: variable 'ansible_connection' from source: unknown 46400 1727204571.08524: variable 'ansible_module_compression' from source: unknown 46400 1727204571.08532: variable 'ansible_shell_type' from source: unknown 46400 1727204571.08538: variable 'ansible_shell_executable' from source: unknown 46400 1727204571.08545: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204571.08552: variable 'ansible_pipelining' from source: unknown 46400 1727204571.08563: variable 'ansible_timeout' from source: unknown 46400 1727204571.08574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204571.08780: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204571.08968: variable 'omit' from source: magic vars 46400 
1727204571.08980: starting attempt loop 46400 1727204571.08988: running the handler 46400 1727204571.09007: _low_level_execute_command(): starting 46400 1727204571.09019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204571.09876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.09893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.09909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.09928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.09976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.09989: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.10004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.10023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.10034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204571.10047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.10062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.10086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.10102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.10115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.10127: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.10140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.10224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204571.10241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.10255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.10370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.12022: stdout chunk (state=3): >>>/root <<< 46400 1727204571.12188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204571.12238: stderr chunk (state=3): >>><<< 46400 1727204571.12242: stdout chunk (state=3): >>><<< 46400 1727204571.12322: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204571.12326: _low_level_execute_command(): starting 46400 1727204571.12330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860 `" && echo ansible-tmp-1727204571.122673-50609-92588992205860="` echo /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860 `" ) && sleep 0' 46400 1727204571.13193: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.13207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.13231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.13251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.13295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.13308: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.13327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.13349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.13362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204571.13377: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.13391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.13405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.13421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.13438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.13454: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.13471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.13551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204571.13576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.13592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.13681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.15535: stdout chunk (state=3): >>>ansible-tmp-1727204571.122673-50609-92588992205860=/root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860 <<< 46400 1727204571.15745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204571.15749: stdout chunk (state=3): >>><<< 46400 1727204571.15752: stderr chunk (state=3): >>><<< 46400 1727204571.16075: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204571.122673-50609-92588992205860=/root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204571.16079: variable 'ansible_module_compression' from source: unknown 46400 1727204571.16082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204571.16085: variable 'ansible_facts' from source: unknown 46400 1727204571.16149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/AnsiballZ_package_facts.py 46400 1727204571.16923: Sending initial data 46400 1727204571.16927: Sent initial data (160 bytes) 46400 1727204571.18976: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.19034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.19052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.19074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.19122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.19251: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.19269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.19289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.19302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204571.19314: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.19329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.19350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.19370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.19383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.19395: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.19410: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.19494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204571.19583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.19598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.19797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.21504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204571.21510: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204571.21550: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpvjhg3_7y /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/AnsiballZ_package_facts.py <<< 46400 1727204571.21584: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204571.24771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204571.24855: stderr chunk (state=3): >>><<< 46400 1727204571.24859: stdout chunk (state=3): >>><<< 46400 1727204571.24886: done transferring module to remote 46400 1727204571.24895: _low_level_execute_command(): starting 46400 1727204571.24902: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/ /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/AnsiballZ_package_facts.py && sleep 0' 46400 1727204571.25834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.25855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.25873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.25899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.25943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.25959: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.25980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.26004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.26020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204571.26031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.26044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.26058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.26080: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.26094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.26108: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.26126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.26238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204571.26256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.26273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.26349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.28178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204571.28182: stdout chunk (state=3): >>><<< 46400 1727204571.28185: stderr chunk (state=3): >>><<< 46400 1727204571.28270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204571.28275: _low_level_execute_command(): starting 46400 1727204571.28279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/AnsiballZ_package_facts.py && sleep 0' 46400 1727204571.29307: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.29323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.29338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.29357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.29413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.29427: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.29442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.29461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.29476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204571.29492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.29511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.29526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.29542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.29557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.29573: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.29588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.29670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204571.29688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.29705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.29791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.76445: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": 
"2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204571.76485: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": 
"9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 46400 1727204571.76519: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 46400 1727204571.76542: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204571.76606: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 46400 1727204571.76610: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": 
"8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204571.76618: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": 
[{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204571.76622: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": 
[{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204571.76670: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 
4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204571.76676: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": 
"mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204571.76709: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": 
"4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204571.76718: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", 
"version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy"<<< 46400 1727204571.76721: stdout chunk (state=3): >>>: "first"}}} <<< 46400 1727204571.78256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204571.78260: stdout chunk (state=3): >>><<< 46400 1727204571.78271: stderr chunk (state=3): >>><<< 46400 1727204571.78315: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": 
[{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": 
"3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": 
"92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, 
{"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": 
"systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204571.85778: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204571.85796: _low_level_execute_command(): starting 46400 1727204571.85800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204571.122673-50609-92588992205860/ > /dev/null 2>&1 && sleep 0' 46400 1727204571.86437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204571.86446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.86458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.86473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.86514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.86520: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204571.86530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.86544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204571.86551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204571.86558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204571.86569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204571.86579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204571.86590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204571.86598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204571.86607: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204571.86614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204571.86689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 
1727204571.86705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204571.86718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204571.86785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204571.88686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204571.88715: stdout chunk (state=3): >>><<< 46400 1727204571.88734: stderr chunk (state=3): >>><<< 46400 1727204571.88823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204571.88831: handler run complete 46400 1727204571.90539: variable 'ansible_facts' from source: unknown 46400 1727204571.91134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204571.94088: variable 'ansible_facts' from source: unknown 46400 1727204571.94958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204571.97426: attempt loop complete, returning result 46400 1727204571.97445: _execute() done 46400 1727204571.97449: dumping result to json 46400 1727204571.97713: done dumping result, returning 46400 1727204571.97723: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-0000000014bc] 46400 1727204571.97730: sending task result for task 0affcd87-79f5-1303-fda8-0000000014bc 46400 1727204572.01146: done sending task result for task 0affcd87-79f5-1303-fda8-0000000014bc 46400 1727204572.01149: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204572.01341: no more pending results, returning what we have 46400 1727204572.01344: results queue empty 46400 1727204572.01345: checking for any_errors_fatal 46400 1727204572.01350: done checking for any_errors_fatal 46400 1727204572.01351: checking for max_fail_percentage 46400 1727204572.01352: done checking for max_fail_percentage 46400 1727204572.01353: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.01354: done checking to see if all hosts have failed 46400 1727204572.01355: getting the remaining hosts for this loop 46400 1727204572.01356: done 
getting the remaining hosts for this loop 46400 1727204572.01359: getting the next task for host managed-node2 46400 1727204572.01394: done getting next task for host managed-node2 46400 1727204572.01399: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204572.01404: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204572.01420: getting variables 46400 1727204572.01422: in VariableManager get_vars() 46400 1727204572.01455: Calling all_inventory to load vars for managed-node2 46400 1727204572.01458: Calling groups_inventory to load vars for managed-node2 46400 1727204572.01468: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.01499: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.01504: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.01508: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.05965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.09148: done with get_vars() 46400 1727204572.09196: done getting variables 46400 1727204572.09268: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:52 -0400 (0:00:01.039) 0:01:02.377 ***** 46400 1727204572.09314: entering _queue_task() for managed-node2/debug 46400 1727204572.09706: worker is 1 (out of 1 available) 46400 1727204572.09722: exiting _queue_task() for managed-node2/debug 46400 1727204572.09738: done queuing things up, now waiting for results queue to drain 46400 1727204572.09740: waiting for pending results... 
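The censored result a few records above comes from the role's "Check which packages are installed" step. Reconstructed only from what this log shows, the logged module_args {"manager": ["auto"], "strategy": "first"} and the no_log censoring, a minimal sketch of such a task (not the role's verbatim source) would be:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # matches the logged module_args {"manager": ["auto"], "strategy": "first"}
        strategy: first
      no_log: true         # produces the "output has been hidden" result recorded above

The package inventory it gathers is the large JSON document printed earlier: a mapping from package name to a list of {name, version, release, epoch, arch, source} entries.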
46400 1727204572.10490: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204572.10689: in run() - task 0affcd87-79f5-1303-fda8-000000001460 46400 1727204572.10712: variable 'ansible_search_path' from source: unknown 46400 1727204572.10721: variable 'ansible_search_path' from source: unknown 46400 1727204572.10775: calling self._execute() 46400 1727204572.10896: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.10909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.10924: variable 'omit' from source: magic vars 46400 1727204572.11342: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.11362: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.11376: variable 'omit' from source: magic vars 46400 1727204572.11451: variable 'omit' from source: magic vars 46400 1727204572.11572: variable 'network_provider' from source: set_fact 46400 1727204572.11599: variable 'omit' from source: magic vars 46400 1727204572.11677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204572.11721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204572.11837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204572.11872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204572.11895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204572.11933: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204572.11943: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.11955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.12223: Set connection var ansible_shell_type to sh 46400 1727204572.12241: Set connection var ansible_shell_executable to /bin/sh 46400 1727204572.12251: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204572.12265: Set connection var ansible_connection to ssh 46400 1727204572.12277: Set connection var ansible_pipelining to False 46400 1727204572.12288: Set connection var ansible_timeout to 10 46400 1727204572.12344: variable 'ansible_shell_executable' from source: unknown 46400 1727204572.12354: variable 'ansible_connection' from source: unknown 46400 1727204572.12366: variable 'ansible_module_compression' from source: unknown 46400 1727204572.12374: variable 'ansible_shell_type' from source: unknown 46400 1727204572.12381: variable 'ansible_shell_executable' from source: unknown 46400 1727204572.12386: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.12393: variable 'ansible_pipelining' from source: unknown 46400 1727204572.12398: variable 'ansible_timeout' from source: unknown 46400 1727204572.12406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.12585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204572.12603: variable 'omit' from source: magic vars 46400 1727204572.12614: starting attempt loop 46400 1727204572.12621: running the handler 46400 1727204572.12689: handler run complete 46400 1727204572.12711: attempt loop complete, returning result 46400 1727204572.12718: _execute() done 46400 1727204572.12725: dumping result to json 46400 1727204572.12731: done dumping result, returning 46400 1727204572.12747: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000001460] 46400 1727204572.12756: sending task result for task 0affcd87-79f5-1303-fda8-000000001460 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204572.12949: no more pending results, returning what we have 46400 1727204572.12953: results queue empty 46400 1727204572.12954: checking for any_errors_fatal 46400 1727204572.12969: done checking for any_errors_fatal 46400 1727204572.12970: checking for max_fail_percentage 46400 1727204572.12972: done checking for max_fail_percentage 46400 1727204572.12973: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.12974: done checking to see if all hosts have failed 46400 1727204572.12975: getting the remaining hosts for this loop 46400 1727204572.12977: done getting the remaining hosts for this loop 46400 1727204572.12982: getting the next task for host managed-node2 46400 1727204572.12993: done getting next task for host managed-node2 46400 1727204572.12998: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204572.13005: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204572.13018: getting variables 46400 1727204572.13020: in VariableManager get_vars() 46400 1727204572.13070: Calling all_inventory to load vars for managed-node2 46400 1727204572.13073: Calling groups_inventory to load vars for managed-node2 46400 1727204572.13075: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.13087: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.13090: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.13093: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.14236: done sending task result for task 0affcd87-79f5-1303-fda8-000000001460 46400 1727204572.14244: WORKER PROCESS EXITING 46400 1727204572.18522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.20590: done with get_vars() 46400 1727204572.20636: done getting variables 46400 1727204572.20705: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.114) 0:01:02.492 ***** 46400 1727204572.20755: entering _queue_task() for managed-node2/fail 46400 1727204572.21744: worker is 1 (out of 1 available) 46400 1727204572.21758: exiting _queue_task() for managed-node2/fail 46400 1727204572.21776: done queuing things up, now waiting for results queue to drain 46400 1727204572.21778: waiting for pending results... 
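The "Using network provider: nm" message above is produced by the debug step at roles/network/tasks/main.yml:7, and the log shows network_provider coming from an earlier set_fact. A minimal sketch of that task, under the assumption that it simply echoes the variable:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"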
46400 1727204572.22697: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204572.23088: in run() - task 0affcd87-79f5-1303-fda8-000000001461 46400 1727204572.23110: variable 'ansible_search_path' from source: unknown 46400 1727204572.23118: variable 'ansible_search_path' from source: unknown 46400 1727204572.23159: calling self._execute() 46400 1727204572.23535: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.23548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.23562: variable 'omit' from source: magic vars 46400 1727204572.24013: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.24032: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.24213: variable 'network_state' from source: role '' defaults 46400 1727204572.24230: Evaluated conditional (network_state != {}): False 46400 1727204572.24237: when evaluation is False, skipping this task 46400 1727204572.24244: _execute() done 46400 1727204572.24251: dumping result to json 46400 1727204572.24258: done dumping result, returning 46400 1727204572.24271: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000001461] 46400 1727204572.24289: sending task result for task 0affcd87-79f5-1303-fda8-000000001461 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204572.24449: no more pending results, returning what we have 46400 1727204572.24454: results queue empty 46400 1727204572.24455: checking for any_errors_fatal 46400 1727204572.24462: done checking for any_errors_fatal 46400 1727204572.24463: checking for max_fail_percentage 46400 1727204572.24467: done checking for max_fail_percentage 46400 1727204572.24469: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.24469: done checking to see if all hosts have failed 46400 1727204572.24470: getting the remaining hosts for this loop 46400 1727204572.24472: done getting the remaining hosts for this loop 46400 1727204572.24476: getting the next task for host managed-node2 46400 1727204572.24487: done getting next task for host managed-node2 46400 1727204572.24492: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204572.24499: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204572.24529: getting variables 46400 1727204572.24531: in VariableManager get_vars() 46400 1727204572.24593: Calling all_inventory to load vars for managed-node2 46400 1727204572.24597: Calling groups_inventory to load vars for managed-node2 46400 1727204572.24602: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.24620: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.24624: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.24629: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.25791: done sending task result for task 0affcd87-79f5-1303-fda8-000000001461 46400 1727204572.25795: WORKER PROCESS EXITING 46400 1727204572.27322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.30709: done with get_vars() 46400 1727204572.30748: done getting variables 46400 1727204572.30817: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.101) 0:01:02.593 ***** 46400 1727204572.30875: entering _queue_task() for managed-node2/fail 46400 1727204572.31285: worker is 1 (out of 1 available) 46400 1727204572.31299: exiting _queue_task() for managed-node2/fail 46400 1727204572.31318: done queuing things up, now waiting for results queue to drain 46400 1727204572.31319: waiting for pending results... 
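The skip recorded just above, and the identical skip for the "system version of the managed host is below 8" guard that follows, both hinge on the same conditional. A sketch of the guard pattern, assuming a plain fail task and showing only the clause this log actually evaluates (the real task's when may contain additional clauses, such as the provider check implied by its name):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider   # hypothetical wording
      when: network_state != {}    # evaluates False here because network_state comes from the role defaults

Since the log reads network_state from "role '' defaults" and the condition evaluates False, the role's defaults evidently set network_state to an empty mapping.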
46400 1727204572.31947: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204572.32177: in run() - task 0affcd87-79f5-1303-fda8-000000001462 46400 1727204572.32224: variable 'ansible_search_path' from source: unknown 46400 1727204572.32236: variable 'ansible_search_path' from source: unknown 46400 1727204572.32299: calling self._execute() 46400 1727204572.32458: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.32499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.32520: variable 'omit' from source: magic vars 46400 1727204572.33047: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.33070: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.33272: variable 'network_state' from source: role '' defaults 46400 1727204572.33288: Evaluated conditional (network_state != {}): False 46400 1727204572.33295: when evaluation is False, skipping this task 46400 1727204572.33302: _execute() done 46400 1727204572.33313: dumping result to json 46400 1727204572.33337: done dumping result, returning 46400 1727204572.33348: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000001462] 46400 1727204572.33358: sending task result for task 0affcd87-79f5-1303-fda8-000000001462 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204572.33545: no more pending results, returning what we have 46400 1727204572.33551: results queue empty 46400 1727204572.33552: checking for any_errors_fatal 46400 1727204572.33562: done checking for any_errors_fatal 46400 1727204572.33563: checking for max_fail_percentage 46400 1727204572.33566: done checking for max_fail_percentage 46400 1727204572.33570: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.33571: done checking to see if all hosts have failed 46400 1727204572.33572: getting the remaining hosts for this loop 46400 1727204572.33573: done getting the remaining hosts for this loop 46400 1727204572.33580: getting the next task for host managed-node2 46400 1727204572.33591: done getting next task for host managed-node2 46400 1727204572.33597: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204572.33603: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204572.33638: getting variables 46400 1727204572.33641: in VariableManager get_vars() 46400 1727204572.33702: Calling all_inventory to load vars for managed-node2 46400 1727204572.33708: Calling groups_inventory to load vars for managed-node2 46400 1727204572.33711: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.33729: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.33732: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.33736: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.35042: done sending task result for task 0affcd87-79f5-1303-fda8-000000001462 46400 1727204572.35046: WORKER PROCESS EXITING 46400 1727204572.36866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.41521: done with get_vars() 46400 1727204572.41558: done getting variables 46400 1727204572.41625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.107) 0:01:02.701 ***** 46400 1727204572.41670: entering _queue_task() for managed-node2/fail 46400 1727204572.42022: worker is 1 (out of 1 available) 46400 1727204572.42035: exiting _queue_task() for managed-node2/fail 46400 1727204572.42050: done queuing things up, now waiting for results queue to drain 46400 1727204572.42052: waiting for pending results... 
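The teaming guard queued above is skipped a few records below because ansible_distribution_major_version | int > 9 is False on this EL9 host. A sketch of that guard, again assuming a plain fail task with a hypothetical message:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later   # hypothetical wording
      when: ansible_distribution_major_version | int > 9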
46400 1727204572.42359: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204572.42516: in run() - task 0affcd87-79f5-1303-fda8-000000001463 46400 1727204572.42541: variable 'ansible_search_path' from source: unknown 46400 1727204572.42549: variable 'ansible_search_path' from source: unknown 46400 1727204572.42591: calling self._execute() 46400 1727204572.42701: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.42716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.42730: variable 'omit' from source: magic vars 46400 1727204572.43132: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.43266: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.43476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204572.48265: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204572.48508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204572.48575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204572.48614: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204572.48647: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204572.48737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.48780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.48812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.48852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.48875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.48995: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.49023: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204572.49032: when evaluation is False, skipping this task 46400 1727204572.49039: _execute() done 46400 1727204572.49046: dumping result to json 46400 1727204572.49053: done dumping result, returning 46400 1727204572.49067: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000001463] 46400 1727204572.49083: sending task result for task 
0affcd87-79f5-1303-fda8-000000001463 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204572.49252: no more pending results, returning what we have 46400 1727204572.49257: results queue empty 46400 1727204572.49259: checking for any_errors_fatal 46400 1727204572.49270: done checking for any_errors_fatal 46400 1727204572.49271: checking for max_fail_percentage 46400 1727204572.49274: done checking for max_fail_percentage 46400 1727204572.49275: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.49276: done checking to see if all hosts have failed 46400 1727204572.49277: getting the remaining hosts for this loop 46400 1727204572.49279: done getting the remaining hosts for this loop 46400 1727204572.49284: getting the next task for host managed-node2 46400 1727204572.49295: done getting next task for host managed-node2 46400 1727204572.49301: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204572.49306: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204572.49337: getting variables 46400 1727204572.49340: in VariableManager get_vars() 46400 1727204572.49390: Calling all_inventory to load vars for managed-node2 46400 1727204572.49394: Calling groups_inventory to load vars for managed-node2 46400 1727204572.49396: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.49408: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.49411: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.49414: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.50387: done sending task result for task 0affcd87-79f5-1303-fda8-000000001463 46400 1727204572.50391: WORKER PROCESS EXITING 46400 1727204572.51269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.53917: done with get_vars() 46400 1727204572.53970: done getting variables 46400 1727204572.54040: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.124) 0:01:02.825 ***** 46400 1727204572.54089: entering _queue_task() for managed-node2/dnf 46400 1727204572.54458: worker is 1 (out of 1 available) 46400 1727204572.54473: exiting _queue_task() for managed-node2/dnf 46400 1727204572.54486: done queuing things up, now waiting for results queue to drain 46400 1727204572.54488: waiting for pending results... 
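
The guard above ("Abort applying teaming configuration if the system version of the managed host is EL10 or later") is skipped because ansible_distribution_major_version | int > 9 evaluates to False on this host. A minimal sketch of what such a guard task might look like in the role's tasks file; only the task name and the when condition come from the log, while the module choice (ansible.builtin.fail) and the message text are assumptions:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later  # assumed message text
      when: ansible_distribution_major_version | int > 9
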
46400 1727204572.54807: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204572.54958: in run() - task 0affcd87-79f5-1303-fda8-000000001464 46400 1727204572.54988: variable 'ansible_search_path' from source: unknown 46400 1727204572.54997: variable 'ansible_search_path' from source: unknown 46400 1727204572.55042: calling self._execute() 46400 1727204572.55146: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.55161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.55180: variable 'omit' from source: magic vars 46400 1727204572.55611: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.55633: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.55957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204572.58576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204572.58651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204572.58696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204572.58731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204572.58759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204572.58841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.59177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.59224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.59243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.59258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.59382: variable 'ansible_distribution' from source: facts 46400 1727204572.59411: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.59414: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204572.59825: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204572.59959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.59988: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.60012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.60054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.60073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.60113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.60136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.60159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.60490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.60505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.60544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.60572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.60608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.60653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.60661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.60831: variable 'network_connections' from source: include params 46400 1727204572.60846: variable 'interface' from source: play vars 46400 1727204572.60916: variable 'interface' from source: play vars 46400 1727204572.60993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204572.61178: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204572.61216: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204572.61250: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204572.61284: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204572.61347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204572.61375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204572.61402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.61428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204572.61480: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204572.61728: variable 'network_connections' from source: include params 46400 1727204572.61734: variable 'interface' from source: play vars 46400 1727204572.61801: variable 'interface' from source: play vars 46400 1727204572.61826: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204572.61829: when evaluation is False, skipping this task 46400 1727204572.61832: _execute() done 46400 1727204572.61834: dumping result to json 46400 1727204572.61836: done dumping result, returning 46400 1727204572.61847: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001464] 46400 1727204572.61857: sending task result for task 0affcd87-79f5-1303-fda8-000000001464 46400 1727204572.61955: done sending task result for task 0affcd87-79f5-1303-fda8-000000001464 46400 1727204572.61960: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204572.62016: no more pending results, returning what we have 46400 1727204572.62020: results queue empty 46400 1727204572.62021: checking for any_errors_fatal 46400 1727204572.62028: done checking for any_errors_fatal 46400 1727204572.62029: checking for max_fail_percentage 46400 1727204572.62031: done checking for max_fail_percentage 46400 1727204572.62031: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.62032: done checking to see if all hosts have failed 46400 1727204572.62033: getting the remaining hosts for this loop 46400 1727204572.62035: done getting the remaining hosts for this loop 46400 1727204572.62040: getting the next task for host managed-node2 46400 1727204572.62049: done getting next task for host managed-node2 46400 1727204572.62053: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204572.62058: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204572.62085: getting variables 46400 1727204572.62087: in VariableManager get_vars() 46400 1727204572.62128: Calling all_inventory to load vars for managed-node2 46400 1727204572.62131: Calling groups_inventory to load vars for managed-node2 46400 1727204572.62133: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.62142: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.62145: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.62147: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.64405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.66497: done with get_vars() 46400 1727204572.66549: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204572.66633: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.125) 0:01:02.951 ***** 46400 1727204572.66682: entering _queue_task() for managed-node2/yum 46400 1727204572.67061: worker is 1 (out of 1 available) 46400 1727204572.67081: exiting _queue_task() for managed-node2/yum 46400 1727204572.67100: done queuing things up, now waiting for results queue to drain 46400 1727204572.67101: waiting for pending results... 
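
The DNF update check (main.yml:36) above is skipped because neither wireless nor team connections are defined in this run (__network_wireless_connections_defined or __network_team_connections_defined is False). A rough reconstruction of that task from the dnf action plugin and the two conditionals visible in the log; the package list, state, and check_mode settings are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed package list
        state: latest
      check_mode: true  # assumed: only check for updates, do not install
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
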
46400 1727204572.67435: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204572.67591: in run() - task 0affcd87-79f5-1303-fda8-000000001465 46400 1727204572.67609: variable 'ansible_search_path' from source: unknown 46400 1727204572.67618: variable 'ansible_search_path' from source: unknown 46400 1727204572.67673: calling self._execute() 46400 1727204572.67795: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.67806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.67819: variable 'omit' from source: magic vars 46400 1727204572.68241: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.68258: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204572.68467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204572.74948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204572.75377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204572.75542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204572.75592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204572.75654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204572.75754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204572.75791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204572.75820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204572.75878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204572.75897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204572.76009: variable 'ansible_distribution_major_version' from source: facts 46400 1727204572.76029: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204572.76036: when evaluation is False, skipping this task 46400 1727204572.76043: _execute() done 46400 1727204572.76065: dumping result to json 46400 1727204572.76075: done dumping result, returning 46400 1727204572.76086: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001465] 46400 
1727204572.76095: sending task result for task 0affcd87-79f5-1303-fda8-000000001465 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204572.76250: no more pending results, returning what we have 46400 1727204572.76254: results queue empty 46400 1727204572.76255: checking for any_errors_fatal 46400 1727204572.76268: done checking for any_errors_fatal 46400 1727204572.76268: checking for max_fail_percentage 46400 1727204572.76270: done checking for max_fail_percentage 46400 1727204572.76271: checking to see if all hosts have failed and the running result is not ok 46400 1727204572.76272: done checking to see if all hosts have failed 46400 1727204572.76273: getting the remaining hosts for this loop 46400 1727204572.76275: done getting the remaining hosts for this loop 46400 1727204572.76279: getting the next task for host managed-node2 46400 1727204572.76288: done getting next task for host managed-node2 46400 1727204572.76293: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204572.76298: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204572.76323: getting variables 46400 1727204572.76325: in VariableManager get_vars() 46400 1727204572.76373: Calling all_inventory to load vars for managed-node2 46400 1727204572.76376: Calling groups_inventory to load vars for managed-node2 46400 1727204572.76378: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204572.76389: Calling all_plugins_play to load vars for managed-node2 46400 1727204572.76392: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204572.76395: Calling groups_plugins_play to load vars for managed-node2 46400 1727204572.78332: done sending task result for task 0affcd87-79f5-1303-fda8-000000001465 46400 1727204572.78336: WORKER PROCESS EXITING 46400 1727204572.95011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204572.97045: done with get_vars() 46400 1727204572.97390: done getting variables 46400 1727204572.97516: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:52 -0400 (0:00:00.308) 0:01:03.260 ***** 46400 1727204572.97549: entering _queue_task() for managed-node2/fail 46400 1727204572.98410: worker is 1 (out of 1 available) 46400 1727204572.98423: exiting _queue_task() for managed-node2/fail 46400 1727204572.98667: done queuing things up, now waiting for results queue to drain 46400 1727204572.98670: waiting for pending results... 
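
The YUM variant of the same check (main.yml:48) is skipped because ansible_distribution_major_version | int < 8 is False on this host; note that ansible-core redirects ansible.builtin.yum to ansible.builtin.dnf anyway, as the "redirecting (type: action)" entry shows. A sketch under the same assumptions as above (package list, state, and check_mode are guesses; the module and the evaluated when condition come from the log):

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # assumed package list
        state: latest
      check_mode: true  # assumed
      when: ansible_distribution_major_version | int < 8
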
46400 1727204572.99307: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204572.99448: in run() - task 0affcd87-79f5-1303-fda8-000000001466 46400 1727204572.99457: variable 'ansible_search_path' from source: unknown 46400 1727204572.99465: variable 'ansible_search_path' from source: unknown 46400 1727204572.99511: calling self._execute() 46400 1727204572.99604: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204572.99608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204572.99612: variable 'omit' from source: magic vars 46400 1727204573.00072: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.00097: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.00247: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204573.00491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204573.03921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204573.04011: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204573.04073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204573.04111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204573.04144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204573.04240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.04291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.04323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.04387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.04412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.04466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.04508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.04538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.04591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.04619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.04666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.04699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.04738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.04812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.05049: variable 'network_connections' from source: include params 46400 1727204573.05078: variable 'interface' from source: play vars 46400 1727204573.05215: variable 'interface' from source: play vars 46400 1727204573.05357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204573.05730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204573.05769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204573.05800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204573.05847: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204573.05923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204573.05953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204573.05999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.06017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204573.06085: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204573.06256: variable 'network_connections' 
from source: include params 46400 1727204573.06260: variable 'interface' from source: play vars 46400 1727204573.06348: variable 'interface' from source: play vars 46400 1727204573.06383: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204573.06387: when evaluation is False, skipping this task 46400 1727204573.06390: _execute() done 46400 1727204573.06393: dumping result to json 46400 1727204573.06395: done dumping result, returning 46400 1727204573.06405: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001466] 46400 1727204573.06408: sending task result for task 0affcd87-79f5-1303-fda8-000000001466 46400 1727204573.06594: done sending task result for task 0affcd87-79f5-1303-fda8-000000001466 46400 1727204573.06596: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204573.06647: no more pending results, returning what we have 46400 1727204573.06651: results queue empty 46400 1727204573.06652: checking for any_errors_fatal 46400 1727204573.06680: done checking for any_errors_fatal 46400 1727204573.06681: checking for max_fail_percentage 46400 1727204573.06683: done checking for max_fail_percentage 46400 1727204573.06684: checking to see if all hosts have failed and the running result is not ok 46400 1727204573.06685: done checking to see if all hosts have failed 46400 1727204573.06686: getting the remaining hosts for this loop 46400 1727204573.06688: done getting the remaining hosts for this loop 46400 1727204573.06692: getting the next task for host managed-node2 46400 1727204573.06709: done getting next task for host managed-node2 46400 1727204573.06713: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204573.06732: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204573.06758: getting variables 46400 1727204573.06770: in VariableManager get_vars() 46400 1727204573.06816: Calling all_inventory to load vars for managed-node2 46400 1727204573.06820: Calling groups_inventory to load vars for managed-node2 46400 1727204573.06824: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204573.06836: Calling all_plugins_play to load vars for managed-node2 46400 1727204573.06840: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204573.06843: Calling groups_plugins_play to load vars for managed-node2 46400 1727204573.08357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204573.11201: done with get_vars() 46400 1727204573.11241: done getting variables 46400 1727204573.11323: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.138) 0:01:03.398 ***** 46400 1727204573.11371: entering _queue_task() for managed-node2/package 46400 1727204573.11891: worker is 1 (out of 1 available) 46400 1727204573.11926: exiting _queue_task() for managed-node2/package 46400 1727204573.12039: done queuing things up, now waiting for results queue to drain 46400 1727204573.12041: waiting for pending results... 46400 1727204573.12145: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204573.12290: in run() - task 0affcd87-79f5-1303-fda8-000000001467 46400 1727204573.12300: variable 'ansible_search_path' from source: unknown 46400 1727204573.12304: variable 'ansible_search_path' from source: unknown 46400 1727204573.12347: calling self._execute() 46400 1727204573.12502: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.12506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.12515: variable 'omit' from source: magic vars 46400 1727204573.13029: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.13033: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.13352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204573.13677: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204573.13680: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204573.13683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204573.14024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204573.14028: variable 'network_packages' from source: role '' defaults 46400 1727204573.14087: variable '__network_provider_setup' from source: role '' defaults 46400 1727204573.14090: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204573.14202: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204573.14206: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204573.14276: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204573.14743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204573.18711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204573.18803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204573.18854: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204573.18890: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204573.18917: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204573.19013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.19055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.19200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.19224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.19228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.19230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.19281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.19284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.19367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.19371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.19773: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204573.19928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.19981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.20008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.20048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.20063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.20216: variable 'ansible_python' from source: facts 46400 1727204573.20238: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204573.20383: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204573.20500: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204573.20658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.20714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.20718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.20775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.20789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.20835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.20869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.20893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.20929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.20946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.21324: variable 'network_connections' from source: include params 46400 1727204573.21327: variable 'interface' from source: play vars 46400 1727204573.21471: variable 'interface' from source: play vars 46400 1727204573.21560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204573.21670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204573.21673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.21675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204573.21870: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204573.22277: variable 'network_connections' from source: include params 46400 1727204573.22282: variable 'interface' from source: play vars 46400 1727204573.22517: variable 'interface' from source: play vars 46400 1727204573.22654: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204573.22870: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204573.23533: variable 'network_connections' from source: include params 46400 1727204573.23542: variable 'interface' from source: play vars 46400 1727204573.23611: variable 'interface' from source: play vars 46400 1727204573.23640: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204573.23743: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204573.23940: variable 'network_connections' from source: include params 46400 1727204573.23943: variable 'interface' from source: play vars 46400 1727204573.23994: variable 'interface' from source: play vars 46400 1727204573.24034: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204573.24079: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204573.24085: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204573.24135: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204573.24276: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204573.24582: variable 'network_connections' from source: include params 46400 1727204573.24585: variable 'interface' from source: play vars 46400 1727204573.24629: variable 'interface' from source: play vars 46400 1727204573.24635: variable 'ansible_distribution' from source: facts 46400 1727204573.24638: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.24645: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.24656: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204573.24770: variable 'ansible_distribution' from source: 
facts 46400 1727204573.24774: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.24778: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.24791: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204573.24900: variable 'ansible_distribution' from source: facts 46400 1727204573.24904: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.24908: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.24935: variable 'network_provider' from source: set_fact 46400 1727204573.24946: variable 'ansible_facts' from source: unknown 46400 1727204573.25774: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204573.25777: when evaluation is False, skipping this task 46400 1727204573.25779: _execute() done 46400 1727204573.25780: dumping result to json 46400 1727204573.25782: done dumping result, returning 46400 1727204573.25784: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000001467] 46400 1727204573.25786: sending task result for task 0affcd87-79f5-1303-fda8-000000001467 46400 1727204573.25854: done sending task result for task 0affcd87-79f5-1303-fda8-000000001467 46400 1727204573.25857: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204573.25923: no more pending results, returning what we have 46400 1727204573.25926: results queue empty 46400 1727204573.25927: checking for any_errors_fatal 46400 1727204573.25932: done checking for any_errors_fatal 46400 1727204573.25933: checking for max_fail_percentage 46400 1727204573.25934: done checking for max_fail_percentage 46400 1727204573.25935: checking to see if all hosts have failed and the running result is not ok 46400 1727204573.25936: done checking to see if all hosts have failed 46400 1727204573.25937: getting the remaining hosts for this loop 46400 1727204573.25938: done getting the remaining hosts for this loop 46400 1727204573.25942: getting the next task for host managed-node2 46400 1727204573.25949: done getting next task for host managed-node2 46400 1727204573.25953: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204573.25959: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204573.25986: getting variables 46400 1727204573.25988: in VariableManager get_vars() 46400 1727204573.26030: Calling all_inventory to load vars for managed-node2 46400 1727204573.26033: Calling groups_inventory to load vars for managed-node2 46400 1727204573.26035: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204573.26045: Calling all_plugins_play to load vars for managed-node2 46400 1727204573.26048: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204573.26050: Calling groups_plugins_play to load vars for managed-node2 46400 1727204573.31069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204573.33559: done with get_vars() 46400 1727204573.33602: done getting variables 46400 1727204573.33671: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.223) 0:01:03.621 ***** 46400 1727204573.33717: entering _queue_task() for managed-node2/package 46400 1727204573.34095: worker is 1 (out of 1 available) 46400 1727204573.34113: exiting _queue_task() for managed-node2/package 46400 1727204573.34128: done queuing things up, now waiting for results queue to drain 46400 1727204573.34130: waiting for pending results... 
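
The "Install packages" task (main.yml:73) is skipped because every entry in network_packages is already present in the gathered package facts, so not network_packages is subset(ansible_facts.packages.keys()) is False. A minimal sketch, assuming the role installs the resolved network_packages list through the generic package module that the log shows being loaded; only the task name and the when expression are confirmed by the log:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"  # assumed to be the resolved package list
        state: present  # assumed
      when: not network_packages is subset(ansible_facts.packages.keys())
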
46400 1727204573.34451: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204573.34601: in run() - task 0affcd87-79f5-1303-fda8-000000001468 46400 1727204573.34613: variable 'ansible_search_path' from source: unknown 46400 1727204573.34617: variable 'ansible_search_path' from source: unknown 46400 1727204573.34658: calling self._execute() 46400 1727204573.34810: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.34816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.34827: variable 'omit' from source: magic vars 46400 1727204573.35231: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.35250: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.35387: variable 'network_state' from source: role '' defaults 46400 1727204573.35395: Evaluated conditional (network_state != {}): False 46400 1727204573.35398: when evaluation is False, skipping this task 46400 1727204573.35402: _execute() done 46400 1727204573.35404: dumping result to json 46400 1727204573.35406: done dumping result, returning 46400 1727204573.35415: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001468] 46400 1727204573.35422: sending task result for task 0affcd87-79f5-1303-fda8-000000001468 46400 1727204573.35541: done sending task result for task 0affcd87-79f5-1303-fda8-000000001468 46400 1727204573.35544: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204573.35611: no more pending results, returning what we have 46400 1727204573.35615: results queue empty 46400 1727204573.35617: checking for any_errors_fatal 46400 1727204573.35624: done checking for any_errors_fatal 46400 1727204573.35625: checking for max_fail_percentage 46400 1727204573.35627: done checking for max_fail_percentage 46400 1727204573.35628: checking to see if all hosts have failed and the running result is not ok 46400 1727204573.35629: done checking to see if all hosts have failed 46400 1727204573.35630: getting the remaining hosts for this loop 46400 1727204573.35632: done getting the remaining hosts for this loop 46400 1727204573.35637: getting the next task for host managed-node2 46400 1727204573.35648: done getting next task for host managed-node2 46400 1727204573.35653: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204573.35661: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204573.35690: getting variables 46400 1727204573.35692: in VariableManager get_vars() 46400 1727204573.35737: Calling all_inventory to load vars for managed-node2 46400 1727204573.35740: Calling groups_inventory to load vars for managed-node2 46400 1727204573.35743: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204573.35756: Calling all_plugins_play to load vars for managed-node2 46400 1727204573.35760: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204573.35763: Calling groups_plugins_play to load vars for managed-node2 46400 1727204573.39060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204573.41236: done with get_vars() 46400 1727204573.41281: done getting variables 46400 1727204573.41379: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.077) 0:01:03.698 ***** 46400 1727204573.41430: entering _queue_task() for managed-node2/package 46400 1727204573.41855: worker is 1 (out of 1 available) 46400 1727204573.41879: exiting _queue_task() for managed-node2/package 46400 1727204573.41893: done queuing things up, now waiting for results queue to drain 46400 1727204573.41895: waiting for pending results... 
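Annotation: the same "network_state != {}" guard controls the python3-libnmstate task that runs next. Purely as an illustration of the conditional (not a validated nmstate schema), passing any non-empty mapping for network_state when applying the role would flip both install tasks from skipped to executed:

    # Illustrative only: the value just needs to be a non-empty dict for the
    # "network_state != {}" conditional seen in this log to evaluate True.
    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces: []    # hypothetical placeholder content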
46400 1727204573.42249: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204573.42442: in run() - task 0affcd87-79f5-1303-fda8-000000001469 46400 1727204573.42470: variable 'ansible_search_path' from source: unknown 46400 1727204573.42480: variable 'ansible_search_path' from source: unknown 46400 1727204573.42529: calling self._execute() 46400 1727204573.42645: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.42658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.42681: variable 'omit' from source: magic vars 46400 1727204573.43103: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.43123: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.43267: variable 'network_state' from source: role '' defaults 46400 1727204573.43289: Evaluated conditional (network_state != {}): False 46400 1727204573.43297: when evaluation is False, skipping this task 46400 1727204573.43304: _execute() done 46400 1727204573.43311: dumping result to json 46400 1727204573.43318: done dumping result, returning 46400 1727204573.43337: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001469] 46400 1727204573.43353: sending task result for task 0affcd87-79f5-1303-fda8-000000001469 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204573.43543: no more pending results, returning what we have 46400 1727204573.43547: results queue empty 46400 1727204573.43548: checking for any_errors_fatal 46400 1727204573.43556: done checking for any_errors_fatal 46400 1727204573.43557: checking for max_fail_percentage 46400 1727204573.43559: done checking for max_fail_percentage 46400 1727204573.43567: checking to see if all hosts have failed and the running result is not ok 46400 1727204573.43568: done checking to see if all hosts have failed 46400 1727204573.43568: getting the remaining hosts for this loop 46400 1727204573.43571: done getting the remaining hosts for this loop 46400 1727204573.43578: getting the next task for host managed-node2 46400 1727204573.43589: done getting next task for host managed-node2 46400 1727204573.43594: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204573.43602: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204573.43639: getting variables 46400 1727204573.43641: in VariableManager get_vars() 46400 1727204573.43691: Calling all_inventory to load vars for managed-node2 46400 1727204573.43695: Calling groups_inventory to load vars for managed-node2 46400 1727204573.43700: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204573.43714: Calling all_plugins_play to load vars for managed-node2 46400 1727204573.43718: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204573.43722: Calling groups_plugins_play to load vars for managed-node2 46400 1727204573.44734: done sending task result for task 0affcd87-79f5-1303-fda8-000000001469 46400 1727204573.44738: WORKER PROCESS EXITING 46400 1727204573.45851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204573.47727: done with get_vars() 46400 1727204573.47772: done getting variables 46400 1727204573.47859: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.064) 0:01:03.763 ***** 46400 1727204573.47915: entering _queue_task() for managed-node2/service 46400 1727204573.48373: worker is 1 (out of 1 available) 46400 1727204573.48387: exiting _queue_task() for managed-node2/service 46400 1727204573.48403: done queuing things up, now waiting for results queue to drain 46400 1727204573.48407: waiting for pending results... 
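Annotation: the next skip comes from a service restart guarded by the wireless/team check. A rough sketch of such a task is below; the service action plugin, the task name, and the compound when-expression are what the log confirms, while the module arguments are assumptions.

    # Hypothetical shape of the task at roles/network/tasks/main.yml:109;
    # only the service action and the when-expression come from the log.
    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

Since neither wireless nor team connections are defined for this run, the condition evaluates False and the task is skipped.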
46400 1727204573.48804: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204573.49340: in run() - task 0affcd87-79f5-1303-fda8-00000000146a 46400 1727204573.49368: variable 'ansible_search_path' from source: unknown 46400 1727204573.49382: variable 'ansible_search_path' from source: unknown 46400 1727204573.49439: calling self._execute() 46400 1727204573.49565: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.49579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.49594: variable 'omit' from source: magic vars 46400 1727204573.50356: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.50515: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.50779: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204573.51023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204573.53873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204573.53981: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204573.54053: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204573.54102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204573.54146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204573.54254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.54302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.54343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.54391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.54411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.54475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.54504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.54539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204573.54590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.54610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.54670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.54697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.54725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.54776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.54797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.55004: variable 'network_connections' from source: include params 46400 1727204573.55023: variable 'interface' from source: play vars 46400 1727204573.55120: variable 'interface' from source: play vars 46400 1727204573.55213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204573.55417: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204573.55482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204573.55525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204573.55562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204573.55619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204573.55653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204573.55692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.55724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204573.55791: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204573.56090: variable 'network_connections' from source: include params 46400 1727204573.56107: variable 'interface' 
from source: play vars 46400 1727204573.56203: variable 'interface' from source: play vars 46400 1727204573.56238: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204573.56247: when evaluation is False, skipping this task 46400 1727204573.56257: _execute() done 46400 1727204573.56271: dumping result to json 46400 1727204573.56287: done dumping result, returning 46400 1727204573.56301: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000146a] 46400 1727204573.56312: sending task result for task 0affcd87-79f5-1303-fda8-00000000146a skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204573.56482: no more pending results, returning what we have 46400 1727204573.56486: results queue empty 46400 1727204573.56488: checking for any_errors_fatal 46400 1727204573.56496: done checking for any_errors_fatal 46400 1727204573.56497: checking for max_fail_percentage 46400 1727204573.56498: done checking for max_fail_percentage 46400 1727204573.56499: checking to see if all hosts have failed and the running result is not ok 46400 1727204573.56500: done checking to see if all hosts have failed 46400 1727204573.56501: getting the remaining hosts for this loop 46400 1727204573.56503: done getting the remaining hosts for this loop 46400 1727204573.56507: getting the next task for host managed-node2 46400 1727204573.56517: done getting next task for host managed-node2 46400 1727204573.56526: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204573.56534: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204573.56562: getting variables 46400 1727204573.56565: in VariableManager get_vars() 46400 1727204573.56609: Calling all_inventory to load vars for managed-node2 46400 1727204573.56612: Calling groups_inventory to load vars for managed-node2 46400 1727204573.56614: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204573.56626: Calling all_plugins_play to load vars for managed-node2 46400 1727204573.56628: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204573.56631: Calling groups_plugins_play to load vars for managed-node2 46400 1727204573.57806: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146a 46400 1727204573.57811: WORKER PROCESS EXITING 46400 1727204573.60046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204573.62193: done with get_vars() 46400 1727204573.62224: done getting variables 46400 1727204573.62298: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:02:53 -0400 (0:00:00.144) 0:01:03.907 ***** 46400 1727204573.62336: entering _queue_task() for managed-node2/service 46400 1727204573.62740: worker is 1 (out of 1 available) 46400 1727204573.62762: exiting _queue_task() for managed-node2/service 46400 1727204573.63992: done queuing things up, now waiting for results queue to drain 46400 1727204573.63995: waiting for pending results... 
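Annotation: unlike the three tasks above, "Enable and start NetworkManager" actually runs: the log below shows "network_provider == \"nm\" or network_state != {}" evaluating True, and the systemd module is invoked with name=NetworkManager, state=started, enabled=true (visible in the module_args of the returned JSON). A minimal sketch of a task that would produce that invocation follows; the when-expression, the service action, and the module arguments are taken from this log, and the use of network_service_name mirrors the "variable 'network_service_name' from source: role '' defaults" entries, but the exact YAML of main.yml:122 is an assumption.

    # Approximation of roles/network/tasks/main.yml:122, reconstructed from the
    # log entries below; not the role's verbatim source.
    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

Because NetworkManager.service is already enabled and running on managed-node2, the systemd module reports changed=false, as the returned unit properties further down show.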
46400 1727204573.66185: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204573.66474: in run() - task 0affcd87-79f5-1303-fda8-00000000146b 46400 1727204573.66488: variable 'ansible_search_path' from source: unknown 46400 1727204573.66492: variable 'ansible_search_path' from source: unknown 46400 1727204573.66531: calling self._execute() 46400 1727204573.66743: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.66750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.66763: variable 'omit' from source: magic vars 46400 1727204573.68410: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.68581: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204573.69100: variable 'network_provider' from source: set_fact 46400 1727204573.69104: variable 'network_state' from source: role '' defaults 46400 1727204573.69119: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204573.69125: variable 'omit' from source: magic vars 46400 1727204573.69368: variable 'omit' from source: magic vars 46400 1727204573.69457: variable 'network_service_name' from source: role '' defaults 46400 1727204573.69717: variable 'network_service_name' from source: role '' defaults 46400 1727204573.69971: variable '__network_provider_setup' from source: role '' defaults 46400 1727204573.69975: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204573.70125: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204573.70134: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204573.70221: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204573.71748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204573.79070: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204573.79256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204573.79410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204573.79445: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204573.79474: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204573.79777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.79807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.80013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.80056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204573.80074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.80236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.80259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.80286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.80326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.80423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.80909: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204573.81230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.81254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.81281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.81436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.81452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.81619: variable 'ansible_python' from source: facts 46400 1727204573.81767: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204573.82077: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204573.82158: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204573.82510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.82531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.82557: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.82595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.82726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.82773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204573.82801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204573.82920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.83000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204573.83015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204573.83431: variable 'network_connections' from source: include params 46400 1727204573.83439: variable 'interface' from source: play vars 46400 1727204573.83649: variable 'interface' from source: play vars 46400 1727204573.83877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204573.84279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204573.84462: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204573.84615: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204573.84618: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204573.84738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204573.84769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204573.84905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204573.84937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204573.85103: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204573.85817: variable 'network_connections' from source: include params 46400 1727204573.85824: variable 'interface' from source: play vars 46400 1727204573.85907: variable 'interface' from source: play vars 46400 1727204573.85940: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204573.86074: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204573.86703: variable 'network_connections' from source: include params 46400 1727204573.86707: variable 'interface' from source: play vars 46400 1727204573.86803: variable 'interface' from source: play vars 46400 1727204573.86818: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204573.86901: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204573.87206: variable 'network_connections' from source: include params 46400 1727204573.87326: variable 'interface' from source: play vars 46400 1727204573.87395: variable 'interface' from source: play vars 46400 1727204573.88799: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204573.89270: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204573.89274: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204573.89276: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204573.89471: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204573.90247: variable 'network_connections' from source: include params 46400 1727204573.90250: variable 'interface' from source: play vars 46400 1727204573.90422: variable 'interface' from source: play vars 46400 1727204573.90429: variable 'ansible_distribution' from source: facts 46400 1727204573.90432: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.90440: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.90455: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204573.90939: variable 'ansible_distribution' from source: facts 46400 1727204573.90942: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.90948: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.90967: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204573.92381: variable 'ansible_distribution' from source: facts 46400 1727204573.92385: variable '__network_rh_distros' from source: role '' defaults 46400 1727204573.92388: variable 'ansible_distribution_major_version' from source: facts 46400 1727204573.92430: variable 'network_provider' from source: set_fact 46400 1727204573.92455: variable 'omit' from source: magic vars 46400 1727204573.92498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204573.92525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204573.92545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204573.92566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204573.92574: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204573.92603: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204573.92606: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.92608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.93021: Set connection var ansible_shell_type to sh 46400 1727204573.93032: Set connection var ansible_shell_executable to /bin/sh 46400 1727204573.93042: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204573.93045: Set connection var ansible_connection to ssh 46400 1727204573.93051: Set connection var ansible_pipelining to False 46400 1727204573.93057: Set connection var ansible_timeout to 10 46400 1727204573.93088: variable 'ansible_shell_executable' from source: unknown 46400 1727204573.93092: variable 'ansible_connection' from source: unknown 46400 1727204573.93095: variable 'ansible_module_compression' from source: unknown 46400 1727204573.93097: variable 'ansible_shell_type' from source: unknown 46400 1727204573.93100: variable 'ansible_shell_executable' from source: unknown 46400 1727204573.93102: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204573.93224: variable 'ansible_pipelining' from source: unknown 46400 1727204573.93227: variable 'ansible_timeout' from source: unknown 46400 1727204573.93230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204573.93447: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204573.93458: variable 'omit' from source: magic vars 46400 1727204573.93466: starting attempt loop 46400 1727204573.93469: running the handler 46400 1727204573.93675: variable 'ansible_facts' from source: unknown 46400 1727204573.95312: _low_level_execute_command(): starting 46400 1727204573.95316: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204573.96930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204573.96944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204573.96955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204573.96976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204573.97018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204573.97085: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204573.97096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204573.97109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204573.97119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204573.97124: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204573.97139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204573.97149: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204573.97186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204573.97196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204573.97204: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204573.97218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204573.97328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204573.97472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204573.97486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204573.97573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204573.99225: stdout chunk (state=3): >>>/root <<< 46400 1727204573.99415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204573.99419: stdout chunk (state=3): >>><<< 46400 1727204573.99426: stderr chunk (state=3): >>><<< 46400 1727204573.99453: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204573.99469: _low_level_execute_command(): starting 46400 1727204573.99476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443 `" && echo ansible-tmp-1727204573.9945424-50734-80868755721443="` echo /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443 `" ) && sleep 0' 46400 1727204574.01908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204574.03314: stdout chunk (state=3): >>>ansible-tmp-1727204573.9945424-50734-80868755721443=/root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443 <<< 46400 1727204574.03471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204574.03506: stderr chunk (state=3): >>><<< 46400 1727204574.03511: stdout chunk (state=3): >>><<< 46400 1727204574.03530: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204573.9945424-50734-80868755721443=/root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204574.03568: variable 'ansible_module_compression' from source: unknown 46400 1727204574.03625: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204574.03681: variable 'ansible_facts' from source: unknown 46400 1727204574.03875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/AnsiballZ_systemd.py 46400 1727204574.04910: Sending initial data 46400 1727204574.04919: Sent initial data (155 bytes) 46400 1727204574.07658: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204574.07667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.07718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.07742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.07783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.07820: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204574.07830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.07849: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 46400 1727204574.07929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204574.07935: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204574.07945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.07965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.07975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.07982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.07989: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204574.07998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.08157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204574.08185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204574.08198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204574.08270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204574.10091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204574.10121: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204574.10162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp4adnby3h /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/AnsiballZ_systemd.py <<< 46400 1727204574.10200: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204574.13276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204574.13537: stderr chunk (state=3): >>><<< 46400 1727204574.13540: stdout chunk (state=3): >>><<< 46400 1727204574.13543: done transferring module to remote 46400 1727204574.13545: _low_level_execute_command(): starting 46400 1727204574.13547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/ /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/AnsiballZ_systemd.py && sleep 0' 46400 1727204574.15112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204574.15129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.15143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.15162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.15216: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.15228: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204574.15248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.15275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204574.15288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204574.15305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204574.15318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.15332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.15348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.15361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.15377: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204574.15391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.15473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204574.15490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204574.15505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204574.15597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204574.17471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204574.17478: stdout chunk (state=3): >>><<< 46400 1727204574.17489: stderr chunk (state=3): >>><<< 46400 1727204574.17552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204574.17604: _low_level_execute_command(): starting 46400 1727204574.17608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/AnsiballZ_systemd.py && sleep 0' 46400 1727204574.19308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204574.19312: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.19314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.19317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.19354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.19362: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204574.19376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.19390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204574.19398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204574.19405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204574.19418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.19421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.19431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.19439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204574.19446: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204574.19455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.19529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204574.19547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204574.19559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204574.19633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204574.44883: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204574.44924: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "7020544", "MemoryAvailable": "infinity", "CPUUsageNSec": "2134121000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service 
cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204574.46368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204574.46372: stdout chunk (state=3): >>><<< 46400 1727204574.46375: stderr chunk (state=3): >>><<< 46400 1727204574.46399: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "7020544", "MemoryAvailable": "infinity", "CPUUsageNSec": "2134121000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", 
"Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204574.46586: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204574.46603: _low_level_execute_command(): starting 46400 1727204574.46608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204573.9945424-50734-80868755721443/ > /dev/null 2>&1 && sleep 0' 46400 1727204574.48341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.48345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204574.48468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204574.48473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204574.48499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204574.48503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204574.48595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204574.48717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204574.48739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204574.48933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204574.50971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204574.50975: stdout chunk (state=3): >>><<< 46400 1727204574.50993: stderr chunk (state=3): >>><<< 46400 1727204574.51471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204574.51475: handler run complete 46400 1727204574.51478: attempt loop complete, returning result 46400 1727204574.51480: _execute() done 46400 1727204574.51482: dumping result to json 46400 1727204574.51484: done dumping result, returning 46400 1727204574.51486: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-00000000146b] 46400 1727204574.51488: sending task result for task 0affcd87-79f5-1303-fda8-00000000146b 46400 1727204574.51653: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146b 46400 1727204574.51657: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204574.51713: no more pending results, returning what we have 46400 1727204574.51717: results queue empty 46400 1727204574.51718: checking for any_errors_fatal 46400 1727204574.51723: done checking for any_errors_fatal 46400 1727204574.51723: checking for max_fail_percentage 46400 1727204574.51725: done checking for max_fail_percentage 46400 1727204574.51726: checking to see if all hosts have failed and the running result is not ok 46400 1727204574.51726: done checking to see if all hosts have failed 46400 1727204574.51727: getting the remaining hosts for this loop 46400 1727204574.51728: done getting the remaining hosts for this loop 46400 1727204574.51731: getting the next task for host managed-node2 46400 1727204574.51740: done getting next task for host managed-node2 46400 1727204574.51744: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204574.51750: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204574.51763: getting variables 46400 1727204574.51764: in VariableManager get_vars() 46400 1727204574.51797: Calling all_inventory to load vars for managed-node2 46400 1727204574.51799: Calling groups_inventory to load vars for managed-node2 46400 1727204574.51801: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204574.51810: Calling all_plugins_play to load vars for managed-node2 46400 1727204574.51813: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204574.51815: Calling groups_plugins_play to load vars for managed-node2 46400 1727204574.54644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204574.58654: done with get_vars() 46400 1727204574.58773: done getting variables 46400 1727204574.58954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:54 -0400 (0:00:00.972) 0:01:04.880 ***** 46400 1727204574.59636: entering _queue_task() for managed-node2/service 46400 1727204574.60669: worker is 1 (out of 1 available) 46400 1727204574.60963: exiting _queue_task() for managed-node2/service 46400 1727204574.60978: done queuing things up, now waiting for results queue to drain 46400 1727204574.60980: waiting for pending results... 46400 1727204574.61383: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204574.62107: in run() - task 0affcd87-79f5-1303-fda8-00000000146c 46400 1727204574.62790: variable 'ansible_search_path' from source: unknown 46400 1727204574.62799: variable 'ansible_search_path' from source: unknown 46400 1727204574.62872: calling self._execute() 46400 1727204574.63037: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204574.63164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204574.63623: variable 'omit' from source: magic vars 46400 1727204574.64002: variable 'ansible_distribution_major_version' from source: facts 46400 1727204574.64389: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204574.64522: variable 'network_provider' from source: set_fact 46400 1727204574.64718: Evaluated conditional (network_provider == "nm"): True 46400 1727204574.64821: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204574.64985: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204574.65547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204574.71269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204574.71587: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204574.71701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204574.71742: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204574.71867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204574.71949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204574.72110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204574.72144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204574.72216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204574.72330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204574.72385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204574.72410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204574.72536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204574.72585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204574.72602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204574.72688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204574.72782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204574.72997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204574.73039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204574.73058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 46400 1727204574.73800: variable 'network_connections' from source: include params 46400 1727204574.73819: variable 'interface' from source: play vars 46400 1727204574.73900: variable 'interface' from source: play vars 46400 1727204574.73974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204574.74177: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204574.74219: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204574.74253: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204574.74291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204574.74341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204574.74373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204574.74597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204574.74629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204574.74688: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204574.74957: variable 'network_connections' from source: include params 46400 1727204574.75187: variable 'interface' from source: play vars 46400 1727204574.75256: variable 'interface' from source: play vars 46400 1727204574.75401: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204574.75411: when evaluation is False, skipping this task 46400 1727204574.75419: _execute() done 46400 1727204574.75426: dumping result to json 46400 1727204574.75433: done dumping result, returning 46400 1727204574.75445: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-00000000146c] 46400 1727204574.75470: sending task result for task 0affcd87-79f5-1303-fda8-00000000146c skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204574.75718: no more pending results, returning what we have 46400 1727204574.75723: results queue empty 46400 1727204574.75724: checking for any_errors_fatal 46400 1727204574.75746: done checking for any_errors_fatal 46400 1727204574.75746: checking for max_fail_percentage 46400 1727204574.75748: done checking for max_fail_percentage 46400 1727204574.75749: checking to see if all hosts have failed and the running result is not ok 46400 1727204574.75750: done checking to see if all hosts have failed 46400 1727204574.75750: getting the remaining hosts for this loop 46400 1727204574.75752: done getting the remaining hosts for this loop 46400 1727204574.75756: getting the next task for host managed-node2 46400 1727204574.75767: done 
getting next task for host managed-node2 46400 1727204574.75771: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204574.75777: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204574.75805: getting variables 46400 1727204574.75807: in VariableManager get_vars() 46400 1727204574.75851: Calling all_inventory to load vars for managed-node2 46400 1727204574.75854: Calling groups_inventory to load vars for managed-node2 46400 1727204574.75857: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204574.75871: Calling all_plugins_play to load vars for managed-node2 46400 1727204574.75874: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204574.75877: Calling groups_plugins_play to load vars for managed-node2 46400 1727204574.76679: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146c 46400 1727204574.76683: WORKER PROCESS EXITING 46400 1727204574.81581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204574.87094: done with get_vars() 46400 1727204574.87158: done getting variables 46400 1727204574.87279: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:54 -0400 (0:00:00.277) 0:01:05.157 ***** 46400 1727204574.87344: entering _queue_task() for managed-node2/service 46400 1727204574.87815: worker is 1 (out of 1 available) 46400 1727204574.87828: exiting _queue_task() for managed-node2/service 46400 1727204574.87842: done queuing things up, now waiting for results queue to drain 46400 1727204574.87844: waiting for pending results... 
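For the wpa_supplicant skip recorded above, the executor evaluated ansible_distribution_major_version != '6' (True), network_provider == "nm" (True), and __network_wpa_supplicant_required (False), and that last condition produced the skip. A hedged sketch of how the task is plausibly gated follows; the when list mirrors the evaluated conditionals, while the module and the service name are assumptions.

# Sketch only: the conditions come from the "Evaluated conditional" entries in
# the log; the module choice and its parameters are illustrative assumptions.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required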
46400 1727204574.89988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204574.90558: in run() - task 0affcd87-79f5-1303-fda8-00000000146d 46400 1727204574.90577: variable 'ansible_search_path' from source: unknown 46400 1727204574.90581: variable 'ansible_search_path' from source: unknown 46400 1727204574.90615: calling self._execute() 46400 1727204574.91005: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204574.91009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204574.91086: variable 'omit' from source: magic vars 46400 1727204574.91539: variable 'ansible_distribution_major_version' from source: facts 46400 1727204574.91551: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204574.91682: variable 'network_provider' from source: set_fact 46400 1727204574.91687: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204574.91690: when evaluation is False, skipping this task 46400 1727204574.91694: _execute() done 46400 1727204574.91697: dumping result to json 46400 1727204574.91699: done dumping result, returning 46400 1727204574.91706: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-00000000146d] 46400 1727204574.91712: sending task result for task 0affcd87-79f5-1303-fda8-00000000146d 46400 1727204574.91820: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146d 46400 1727204574.91823: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204574.91872: no more pending results, returning what we have 46400 1727204574.91876: results queue empty 46400 1727204574.91877: checking for any_errors_fatal 46400 1727204574.91885: done checking for any_errors_fatal 46400 1727204574.91886: checking for max_fail_percentage 46400 1727204574.91887: done checking for max_fail_percentage 46400 1727204574.91888: checking to see if all hosts have failed and the running result is not ok 46400 1727204574.91889: done checking to see if all hosts have failed 46400 1727204574.91889: getting the remaining hosts for this loop 46400 1727204574.91891: done getting the remaining hosts for this loop 46400 1727204574.91895: getting the next task for host managed-node2 46400 1727204574.91904: done getting next task for host managed-node2 46400 1727204574.91909: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204574.91916: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204574.91941: getting variables 46400 1727204574.91943: in VariableManager get_vars() 46400 1727204574.91986: Calling all_inventory to load vars for managed-node2 46400 1727204574.91989: Calling groups_inventory to load vars for managed-node2 46400 1727204574.91991: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204574.92001: Calling all_plugins_play to load vars for managed-node2 46400 1727204574.92004: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204574.92006: Calling groups_plugins_play to load vars for managed-node2 46400 1727204574.95487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204574.99780: done with get_vars() 46400 1727204574.99845: done getting variables 46400 1727204574.99925: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:54 -0400 (0:00:00.126) 0:01:05.284 ***** 46400 1727204574.99967: entering _queue_task() for managed-node2/copy 46400 1727204575.00363: worker is 1 (out of 1 available) 46400 1727204575.00379: exiting _queue_task() for managed-node2/copy 46400 1727204575.00393: done queuing things up, now waiting for results queue to drain 46400 1727204575.00395: waiting for pending results... 
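The "Enable network service" task above was skipped because (network_provider == "initscripts") evaluated False under the nm provider, and the copy task queued next is gated the same way. A hedged illustration of that gating follows; only the task name, the condition, and the no_log behaviour come from the log, while the module and the service unit name are assumptions.

# Sketch only: the when expression is the false_condition recorded for the skip;
# using the generic service module and the unit name "network" are assumptions.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
  no_log: true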
46400 1727204575.01679: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204575.01869: in run() - task 0affcd87-79f5-1303-fda8-00000000146e 46400 1727204575.01890: variable 'ansible_search_path' from source: unknown 46400 1727204575.01898: variable 'ansible_search_path' from source: unknown 46400 1727204575.01944: calling self._execute() 46400 1727204575.02071: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204575.02083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204575.02096: variable 'omit' from source: magic vars 46400 1727204575.02505: variable 'ansible_distribution_major_version' from source: facts 46400 1727204575.02525: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204575.02644: variable 'network_provider' from source: set_fact 46400 1727204575.02654: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204575.02665: when evaluation is False, skipping this task 46400 1727204575.02673: _execute() done 46400 1727204575.02680: dumping result to json 46400 1727204575.02687: done dumping result, returning 46400 1727204575.02698: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-00000000146e] 46400 1727204575.02711: sending task result for task 0affcd87-79f5-1303-fda8-00000000146e skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204575.02893: no more pending results, returning what we have 46400 1727204575.02898: results queue empty 46400 1727204575.02900: checking for any_errors_fatal 46400 1727204575.02908: done checking for any_errors_fatal 46400 1727204575.02909: checking for max_fail_percentage 46400 1727204575.02911: done checking for max_fail_percentage 46400 1727204575.02912: checking to see if all hosts have failed and the running result is not ok 46400 1727204575.02913: done checking to see if all hosts have failed 46400 1727204575.02914: getting the remaining hosts for this loop 46400 1727204575.02916: done getting the remaining hosts for this loop 46400 1727204575.02921: getting the next task for host managed-node2 46400 1727204575.02932: done getting next task for host managed-node2 46400 1727204575.02940: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204575.02946: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204575.02978: getting variables 46400 1727204575.02980: in VariableManager get_vars() 46400 1727204575.03028: Calling all_inventory to load vars for managed-node2 46400 1727204575.03031: Calling groups_inventory to load vars for managed-node2 46400 1727204575.03034: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204575.03048: Calling all_plugins_play to load vars for managed-node2 46400 1727204575.03051: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204575.03054: Calling groups_plugins_play to load vars for managed-node2 46400 1727204575.04030: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146e 46400 1727204575.04033: WORKER PROCESS EXITING 46400 1727204575.05416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204575.09447: done with get_vars() 46400 1727204575.09601: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.097) 0:01:05.382 ***** 46400 1727204575.09769: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204575.10639: worker is 1 (out of 1 available) 46400 1727204575.10652: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204575.10774: done queuing things up, now waiting for results queue to drain 46400 1727204575.10780: waiting for pending results... 
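The task queued here hands off to the role's bundled network_connections module (the AnsiballZ transfer for it appears further down). The log exposes only the module name and the variables feeding it: network_provider from set_fact, network_connections from include params, the interface play var, and __lsr_ansible_managed rendered from get_ansible_managed.j2. A hedged sketch of a plausible task shape follows; the parameter names are assumptions and the actual connection values are not shown in this part of the log.

# Sketch only: the module name is taken from the log; the parameter names
# (provider, connections) and the overall shape are assumptions, not the
# role's literal YAML.
- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: "{{ network_provider }}"
    connections: "{{ network_connections }}"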
46400 1727204575.11636: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204575.12002: in run() - task 0affcd87-79f5-1303-fda8-00000000146f 46400 1727204575.12016: variable 'ansible_search_path' from source: unknown 46400 1727204575.12020: variable 'ansible_search_path' from source: unknown 46400 1727204575.12058: calling self._execute() 46400 1727204575.12270: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204575.12276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204575.12288: variable 'omit' from source: magic vars 46400 1727204575.13124: variable 'ansible_distribution_major_version' from source: facts 46400 1727204575.13136: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204575.13143: variable 'omit' from source: magic vars 46400 1727204575.13323: variable 'omit' from source: magic vars 46400 1727204575.13588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204575.19109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204575.19236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204575.19420: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204575.19458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204575.19604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204575.19701: variable 'network_provider' from source: set_fact 46400 1727204575.19991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204575.20181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204575.20215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204575.20382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204575.20403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204575.20515: variable 'omit' from source: magic vars 46400 1727204575.20828: variable 'omit' from source: magic vars 46400 1727204575.21066: variable 'network_connections' from source: include params 46400 1727204575.21086: variable 'interface' from source: play vars 46400 1727204575.21198: variable 'interface' from source: play vars 46400 1727204575.21620: variable 'omit' from source: magic vars 46400 1727204575.21634: variable '__lsr_ansible_managed' from source: task vars 46400 1727204575.21729: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204575.22185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204575.22720: Loaded config def from plugin (lookup/template) 46400 1727204575.22882: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204575.22920: File lookup term: get_ansible_managed.j2 46400 1727204575.22929: variable 'ansible_search_path' from source: unknown 46400 1727204575.22939: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204575.22957: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204575.23004: variable 'ansible_search_path' from source: unknown 46400 1727204575.34827: variable 'ansible_managed' from source: unknown 46400 1727204575.35001: variable 'omit' from source: magic vars 46400 1727204575.35041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204575.35078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204575.35108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204575.35135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204575.35149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204575.35185: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204575.35201: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204575.35210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204575.35320: Set connection var ansible_shell_type to sh 46400 1727204575.35336: Set connection var ansible_shell_executable to /bin/sh 46400 1727204575.35351: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204575.35366: Set connection var ansible_connection to ssh 46400 1727204575.35377: Set connection var ansible_pipelining to False 46400 1727204575.35387: Set connection var ansible_timeout to 10 46400 1727204575.35424: variable 'ansible_shell_executable' from source: unknown 46400 1727204575.35432: variable 'ansible_connection' from source: unknown 46400 1727204575.35439: variable 'ansible_module_compression' 
from source: unknown 46400 1727204575.35445: variable 'ansible_shell_type' from source: unknown 46400 1727204575.35454: variable 'ansible_shell_executable' from source: unknown 46400 1727204575.35470: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204575.35478: variable 'ansible_pipelining' from source: unknown 46400 1727204575.35485: variable 'ansible_timeout' from source: unknown 46400 1727204575.35493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204575.35650: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204575.35688: variable 'omit' from source: magic vars 46400 1727204575.35698: starting attempt loop 46400 1727204575.35705: running the handler 46400 1727204575.35722: _low_level_execute_command(): starting 46400 1727204575.35739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204575.36580: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204575.36597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.36617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.36637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.36691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.36704: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204575.36719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.36742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204575.36754: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204575.36775: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204575.36789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.36804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.36819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.36837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.36851: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204575.36868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.36955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.36978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204575.36997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.37177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204575.38786: stdout chunk (state=3): >>>/root <<< 46400 1727204575.39000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204575.39004: stdout chunk (state=3): >>><<< 46400 1727204575.39007: stderr 
chunk (state=3): >>><<< 46400 1727204575.39133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204575.39137: _low_level_execute_command(): starting 46400 1727204575.39141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622 `" && echo ansible-tmp-1727204575.3902879-50825-216887957716622="` echo /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622 `" ) && sleep 0' 46400 1727204575.40500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.40505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.40652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.40658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.40660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.40720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.40852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204575.40856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.40911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204575.42779: stdout chunk (state=3): >>>ansible-tmp-1727204575.3902879-50825-216887957716622=/root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622 <<< 46400 1727204575.42895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 
1727204575.42983: stderr chunk (state=3): >>><<< 46400 1727204575.42987: stdout chunk (state=3): >>><<< 46400 1727204575.43275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204575.3902879-50825-216887957716622=/root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204575.43279: variable 'ansible_module_compression' from source: unknown 46400 1727204575.43281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204575.43283: variable 'ansible_facts' from source: unknown 46400 1727204575.43291: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/AnsiballZ_network_connections.py 46400 1727204575.43938: Sending initial data 46400 1727204575.43941: Sent initial data (168 bytes) 46400 1727204575.46908: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.46913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.47051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.47055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.47268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.47282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.47345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 
1727204575.49082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204575.49128: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204575.49188: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpm_ezomzo /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/AnsiballZ_network_connections.py <<< 46400 1727204575.49241: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204575.51209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204575.51467: stderr chunk (state=3): >>><<< 46400 1727204575.51471: stdout chunk (state=3): >>><<< 46400 1727204575.51473: done transferring module to remote 46400 1727204575.51475: _low_level_execute_command(): starting 46400 1727204575.51477: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/ /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/AnsiballZ_network_connections.py && sleep 0' 46400 1727204575.52743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204575.52758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.52777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.52803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.52847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.52860: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204575.52876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.52900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204575.52916: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204575.52927: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204575.52938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.52950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.52967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.52979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.52989: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204575.53003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 
1727204575.53087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.53244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204575.53261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.53423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204575.55231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204575.55235: stdout chunk (state=3): >>><<< 46400 1727204575.55237: stderr chunk (state=3): >>><<< 46400 1727204575.55344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204575.55349: _low_level_execute_command(): starting 46400 1727204575.55351: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/AnsiballZ_network_connections.py && sleep 0' 46400 1727204575.56244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204575.56266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.56286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.56305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.56353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.56372: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204575.56388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.56411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204575.56424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204575.56440: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204575.56451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204575.56463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.56484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 
1727204575.56494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204575.56503: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204575.56513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.56599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.56617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204575.56633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.56720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204575.80713: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e8z7r7io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e8z7r7io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/32d7bf17-3bad-4841-bdea-bee9f6832024: error=unknown <<< 46400 1727204575.80902: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204575.82501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204575.82505: stdout chunk (state=3): >>><<< 46400 1727204575.82508: stderr chunk (state=3): >>><<< 46400 1727204575.82656: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e8z7r7io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e8z7r7io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/32d7bf17-3bad-4841-bdea-bee9f6832024: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
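The module arguments echoed in the log above are the role's translation of its public variables: provider "nm" and a single connection profile "statebr" requested absent. A play that drives the role into this invocation would look roughly like the following sketch; the variable names follow the role's documented interface (network_connections, network_provider), but the snippet is reconstructed from the module_args for illustration rather than copied from the test playbook.

---
- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # Becomes the module's "connections" argument seen in the log.
        network_connections:
          - name: statebr
            persistent_state: absent
        # Becomes the "provider" argument; "nm" selects NetworkManager.
        network_provider: nm

The traceback about "Connection volatilize aborted" arrives on the module's stdout ahead of the JSON result, yet the result carries changed=true with no failure flag, so the run treats the profile removal as successful.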
46400 1727204575.82660: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204575.82663: _low_level_execute_command(): starting 46400 1727204575.82668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204575.3902879-50825-216887957716622/ > /dev/null 2>&1 && sleep 0' 46400 1727204575.85140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204575.85145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.85370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.85374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204575.85377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204575.85519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204575.85537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204575.85552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204575.85631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204575.87551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204575.87555: stdout chunk (state=3): >>><<< 46400 1727204575.87557: stderr chunk (state=3): >>><<< 46400 1727204575.87776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204575.87780: handler run complete 46400 1727204575.87782: attempt loop complete, returning result 46400 1727204575.87785: _execute() done 46400 1727204575.87786: dumping result to json 46400 1727204575.87788: done dumping result, returning 46400 1727204575.87790: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-00000000146f] 46400 1727204575.87792: sending task result for task 0affcd87-79f5-1303-fda8-00000000146f 46400 1727204575.87872: done sending task result for task 0affcd87-79f5-1303-fda8-00000000146f 46400 1727204575.87878: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 46400 1727204575.87990: no more pending results, returning what we have 46400 1727204575.87994: results queue empty 46400 1727204575.87996: checking for any_errors_fatal 46400 1727204575.88002: done checking for any_errors_fatal 46400 1727204575.88003: checking for max_fail_percentage 46400 1727204575.88005: done checking for max_fail_percentage 46400 1727204575.88006: checking to see if all hosts have failed and the running result is not ok 46400 1727204575.88006: done checking to see if all hosts have failed 46400 1727204575.88007: getting the remaining hosts for this loop 46400 1727204575.88009: done getting the remaining hosts for this loop 46400 1727204575.88013: getting the next task for host managed-node2 46400 1727204575.88022: done getting next task for host managed-node2 46400 1727204575.88027: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204575.88032: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204575.88047: getting variables 46400 1727204575.88049: in VariableManager get_vars() 46400 1727204575.88093: Calling all_inventory to load vars for managed-node2 46400 1727204575.88096: Calling groups_inventory to load vars for managed-node2 46400 1727204575.88099: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204575.88109: Calling all_plugins_play to load vars for managed-node2 46400 1727204575.88113: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204575.88116: Calling groups_plugins_play to load vars for managed-node2 46400 1727204575.93077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204575.96599: done with get_vars() 46400 1727204575.96641: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:55 -0400 (0:00:00.869) 0:01:06.251 ***** 46400 1727204575.96729: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204575.97100: worker is 1 (out of 1 available) 46400 1727204575.97112: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204575.97146: done queuing things up, now waiting for results queue to drain 46400 1727204575.97149: waiting for pending results... 46400 1727204575.98020: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204575.98187: in run() - task 0affcd87-79f5-1303-fda8-000000001470 46400 1727204575.98207: variable 'ansible_search_path' from source: unknown 46400 1727204575.98214: variable 'ansible_search_path' from source: unknown 46400 1727204575.98266: calling self._execute() 46400 1727204575.98376: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204575.98389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204575.98402: variable 'omit' from source: magic vars 46400 1727204575.98788: variable 'ansible_distribution_major_version' from source: facts 46400 1727204575.98811: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204575.98935: variable 'network_state' from source: role '' defaults 46400 1727204575.98952: Evaluated conditional (network_state != {}): False 46400 1727204575.98960: when evaluation is False, skipping this task 46400 1727204575.98970: _execute() done 46400 1727204575.98978: dumping result to json 46400 1727204575.98985: done dumping result, returning 46400 1727204575.98996: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000001470] 46400 1727204575.99010: sending task result for task 0affcd87-79f5-1303-fda8-000000001470 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204575.99177: no more pending results, returning what we have 46400 1727204575.99182: results queue empty 46400 1727204575.99183: checking for any_errors_fatal 46400 1727204575.99196: done checking for any_errors_fatal 46400 1727204575.99197: checking 
for max_fail_percentage 46400 1727204575.99199: done checking for max_fail_percentage 46400 1727204575.99200: checking to see if all hosts have failed and the running result is not ok 46400 1727204575.99201: done checking to see if all hosts have failed 46400 1727204575.99201: getting the remaining hosts for this loop 46400 1727204575.99203: done getting the remaining hosts for this loop 46400 1727204575.99208: getting the next task for host managed-node2 46400 1727204575.99216: done getting next task for host managed-node2 46400 1727204575.99221: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204575.99230: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204575.99259: getting variables 46400 1727204575.99261: in VariableManager get_vars() 46400 1727204575.99309: Calling all_inventory to load vars for managed-node2 46400 1727204575.99312: Calling groups_inventory to load vars for managed-node2 46400 1727204575.99315: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204575.99329: Calling all_plugins_play to load vars for managed-node2 46400 1727204575.99332: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204575.99335: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.00314: done sending task result for task 0affcd87-79f5-1303-fda8-000000001470 46400 1727204576.00317: WORKER PROCESS EXITING 46400 1727204576.01841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.04923: done with get_vars() 46400 1727204576.04959: done getting variables 46400 1727204576.05033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.083) 0:01:06.335 ***** 46400 1727204576.05074: entering _queue_task() for managed-node2/debug 46400 1727204576.05439: worker is 1 (out of 1 available) 46400 1727204576.05457: exiting _queue_task() for managed-node2/debug 46400 1727204576.05474: done queuing things up, now waiting for results queue to drain 46400 1727204576.05476: waiting for pending results... 
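The "Configure networking state" task above is skipped because the role default for network_state is an empty dict, so only the profile-based network_connections path runs in this play. A caller that wanted the state-based path instead would set network_state to a non-empty Nmstate-style document, roughly as below; the interface name and settings here are invented purely for illustration.

---
- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # A non-empty dict makes the "network_state != {}" conditional true,
        # so the "Configure networking state" task would run instead of skipping.
        network_state:
          interfaces:
            - name: eth1          # illustrative device name
              type: ethernet
              state: up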
46400 1727204576.05789: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204576.05952: in run() - task 0affcd87-79f5-1303-fda8-000000001471 46400 1727204576.05973: variable 'ansible_search_path' from source: unknown 46400 1727204576.05981: variable 'ansible_search_path' from source: unknown 46400 1727204576.06028: calling self._execute() 46400 1727204576.06136: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.06146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.06156: variable 'omit' from source: magic vars 46400 1727204576.06532: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.06549: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.06570: variable 'omit' from source: magic vars 46400 1727204576.06641: variable 'omit' from source: magic vars 46400 1727204576.06687: variable 'omit' from source: magic vars 46400 1727204576.06728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204576.06772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204576.06803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204576.06824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.06837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.06872: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204576.06962: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.06973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.07069: Set connection var ansible_shell_type to sh 46400 1727204576.07085: Set connection var ansible_shell_executable to /bin/sh 46400 1727204576.07100: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204576.07110: Set connection var ansible_connection to ssh 46400 1727204576.07120: Set connection var ansible_pipelining to False 46400 1727204576.07147: Set connection var ansible_timeout to 10 46400 1727204576.07185: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.07194: variable 'ansible_connection' from source: unknown 46400 1727204576.07205: variable 'ansible_module_compression' from source: unknown 46400 1727204576.07218: variable 'ansible_shell_type' from source: unknown 46400 1727204576.07228: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.07242: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.07256: variable 'ansible_pipelining' from source: unknown 46400 1727204576.07266: variable 'ansible_timeout' from source: unknown 46400 1727204576.07275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.07441: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204576.07461: variable 'omit' from source: magic vars 46400 1727204576.07478: starting attempt loop 46400 1727204576.07485: running the handler 46400 1727204576.07629: variable '__network_connections_result' from source: set_fact 46400 1727204576.07699: handler run complete 46400 1727204576.07722: attempt loop complete, returning result 46400 1727204576.07729: _execute() done 46400 1727204576.07736: dumping result to json 46400 1727204576.07748: done dumping result, returning 46400 1727204576.07761: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000001471] 46400 1727204576.07774: sending task result for task 0affcd87-79f5-1303-fda8-000000001471 46400 1727204576.07894: done sending task result for task 0affcd87-79f5-1303-fda8-000000001471 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 46400 1727204576.07977: no more pending results, returning what we have 46400 1727204576.07982: results queue empty 46400 1727204576.07983: checking for any_errors_fatal 46400 1727204576.07989: done checking for any_errors_fatal 46400 1727204576.07992: checking for max_fail_percentage 46400 1727204576.07994: done checking for max_fail_percentage 46400 1727204576.07995: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.07996: done checking to see if all hosts have failed 46400 1727204576.07997: getting the remaining hosts for this loop 46400 1727204576.07998: done getting the remaining hosts for this loop 46400 1727204576.08002: getting the next task for host managed-node2 46400 1727204576.08012: done getting next task for host managed-node2 46400 1727204576.08016: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204576.08022: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204576.08036: getting variables 46400 1727204576.08039: in VariableManager get_vars() 46400 1727204576.08087: Calling all_inventory to load vars for managed-node2 46400 1727204576.08091: Calling groups_inventory to load vars for managed-node2 46400 1727204576.08094: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.08107: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.08110: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.08113: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.09138: WORKER PROCESS EXITING 46400 1727204576.10918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.12620: done with get_vars() 46400 1727204576.12651: done getting variables 46400 1727204576.12714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.076) 0:01:06.412 ***** 46400 1727204576.12763: entering _queue_task() for managed-node2/debug 46400 1727204576.13226: worker is 1 (out of 1 available) 46400 1727204576.13240: exiting _queue_task() for managed-node2/debug 46400 1727204576.13253: done queuing things up, now waiting for results queue to drain 46400 1727204576.13255: waiting for pending results... 
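The debug task that just completed prints __network_connections_result.stderr_lines, which is why the output above is a single empty string (the module's stderr was just a newline). In the role's tasks file it is essentially a one-line debug of this shape, reconstructed here from the printed variable name rather than quoted verbatim:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines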
46400 1727204576.13561: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204576.13749: in run() - task 0affcd87-79f5-1303-fda8-000000001472 46400 1727204576.13772: variable 'ansible_search_path' from source: unknown 46400 1727204576.13780: variable 'ansible_search_path' from source: unknown 46400 1727204576.13866: calling self._execute() 46400 1727204576.14013: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.14031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.14047: variable 'omit' from source: magic vars 46400 1727204576.14446: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.14469: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.14481: variable 'omit' from source: magic vars 46400 1727204576.14545: variable 'omit' from source: magic vars 46400 1727204576.14591: variable 'omit' from source: magic vars 46400 1727204576.14643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204576.14691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204576.14791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204576.14815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.14840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.14877: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204576.14940: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.14948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.15103: Set connection var ansible_shell_type to sh 46400 1727204576.15182: Set connection var ansible_shell_executable to /bin/sh 46400 1727204576.15228: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204576.15240: Set connection var ansible_connection to ssh 46400 1727204576.15250: Set connection var ansible_pipelining to False 46400 1727204576.15266: Set connection var ansible_timeout to 10 46400 1727204576.15369: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.15381: variable 'ansible_connection' from source: unknown 46400 1727204576.15389: variable 'ansible_module_compression' from source: unknown 46400 1727204576.15396: variable 'ansible_shell_type' from source: unknown 46400 1727204576.15403: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.15441: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.15450: variable 'ansible_pipelining' from source: unknown 46400 1727204576.15457: variable 'ansible_timeout' from source: unknown 46400 1727204576.15471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.15812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204576.15825: variable 'omit' from source: magic vars 46400 1727204576.15830: starting attempt loop 46400 1727204576.15833: running the handler 46400 1727204576.15905: variable '__network_connections_result' from source: set_fact 46400 1727204576.16000: variable '__network_connections_result' from source: set_fact 46400 1727204576.16113: handler run complete 46400 1727204576.16138: attempt loop complete, returning result 46400 1727204576.16141: _execute() done 46400 1727204576.16144: dumping result to json 46400 1727204576.16148: done dumping result, returning 46400 1727204576.16156: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000001472] 46400 1727204576.16162: sending task result for task 0affcd87-79f5-1303-fda8-000000001472 46400 1727204576.16267: done sending task result for task 0affcd87-79f5-1303-fda8-000000001472 46400 1727204576.16270: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 46400 1727204576.16390: no more pending results, returning what we have 46400 1727204576.16394: results queue empty 46400 1727204576.16396: checking for any_errors_fatal 46400 1727204576.16403: done checking for any_errors_fatal 46400 1727204576.16404: checking for max_fail_percentage 46400 1727204576.16406: done checking for max_fail_percentage 46400 1727204576.16407: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.16408: done checking to see if all hosts have failed 46400 1727204576.16408: getting the remaining hosts for this loop 46400 1727204576.16410: done getting the remaining hosts for this loop 46400 1727204576.16413: getting the next task for host managed-node2 46400 1727204576.16421: done getting next task for host managed-node2 46400 1727204576.16424: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204576.16429: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204576.16440: getting variables 46400 1727204576.16442: in VariableManager get_vars() 46400 1727204576.16478: Calling all_inventory to load vars for managed-node2 46400 1727204576.16481: Calling groups_inventory to load vars for managed-node2 46400 1727204576.16483: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.16492: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.16494: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.16497: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.18015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.20120: done with get_vars() 46400 1727204576.20265: done getting variables 46400 1727204576.20331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.076) 0:01:06.489 ***** 46400 1727204576.20489: entering _queue_task() for managed-node2/debug 46400 1727204576.20996: worker is 1 (out of 1 available) 46400 1727204576.21014: exiting _queue_task() for managed-node2/debug 46400 1727204576.21054: done queuing things up, now waiting for results queue to drain 46400 1727204576.21059: waiting for pending results... 46400 1727204576.21417: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204576.21576: in run() - task 0affcd87-79f5-1303-fda8-000000001473 46400 1727204576.21597: variable 'ansible_search_path' from source: unknown 46400 1727204576.21780: variable 'ansible_search_path' from source: unknown 46400 1727204576.21821: calling self._execute() 46400 1727204576.21924: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.21935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.21948: variable 'omit' from source: magic vars 46400 1727204576.22334: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.22350: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.22476: variable 'network_state' from source: role '' defaults 46400 1727204576.22493: Evaluated conditional (network_state != {}): False 46400 1727204576.22501: when evaluation is False, skipping this task 46400 1727204576.22507: _execute() done 46400 1727204576.22515: dumping result to json 46400 1727204576.22522: done dumping result, returning 46400 1727204576.22539: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000001473] 46400 1727204576.22550: sending task result for task 0affcd87-79f5-1303-fda8-000000001473 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204576.22698: no more pending results, returning what we have 46400 1727204576.22702: results queue empty 46400 1727204576.22703: checking for any_errors_fatal 46400 1727204576.22711: done checking 
for any_errors_fatal 46400 1727204576.22712: checking for max_fail_percentage 46400 1727204576.22714: done checking for max_fail_percentage 46400 1727204576.22715: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.22716: done checking to see if all hosts have failed 46400 1727204576.22717: getting the remaining hosts for this loop 46400 1727204576.22718: done getting the remaining hosts for this loop 46400 1727204576.22722: getting the next task for host managed-node2 46400 1727204576.22732: done getting next task for host managed-node2 46400 1727204576.22736: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204576.22741: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204576.22775: getting variables 46400 1727204576.22778: in VariableManager get_vars() 46400 1727204576.22825: Calling all_inventory to load vars for managed-node2 46400 1727204576.22828: Calling groups_inventory to load vars for managed-node2 46400 1727204576.22831: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.22844: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.22847: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.22850: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.23905: done sending task result for task 0affcd87-79f5-1303-fda8-000000001473 46400 1727204576.23908: WORKER PROCESS EXITING 46400 1727204576.24852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.26604: done with get_vars() 46400 1727204576.26634: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.062) 0:01:06.551 ***** 46400 1727204576.26739: entering _queue_task() for managed-node2/ping 46400 1727204576.27175: worker is 1 (out of 1 available) 46400 1727204576.27189: exiting _queue_task() for managed-node2/ping 46400 1727204576.27203: done queuing things up, now waiting for results queue to drain 46400 1727204576.27205: waiting for pending results... 
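With both network_state tasks skipped, the role reaches its final "Re-test connectivity" check, queued for managed-node2 with the ping action. The underlying task is a plain module call along these lines (a sketch matching the task name and action seen in the log; the fully qualified module name is an assumption):

- name: Re-test connectivity
  ansible.builtin.ping:

ping here is not ICMP: it transfers and executes a small Python module over the existing SSH connection, which is exactly the AnsiballZ_ping.py transfer visible in the log that follows.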
46400 1727204576.27611: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204576.27833: in run() - task 0affcd87-79f5-1303-fda8-000000001474 46400 1727204576.27858: variable 'ansible_search_path' from source: unknown 46400 1727204576.27875: variable 'ansible_search_path' from source: unknown 46400 1727204576.27921: calling self._execute() 46400 1727204576.28029: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.28040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.28063: variable 'omit' from source: magic vars 46400 1727204576.28590: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.28615: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.28634: variable 'omit' from source: magic vars 46400 1727204576.28725: variable 'omit' from source: magic vars 46400 1727204576.28782: variable 'omit' from source: magic vars 46400 1727204576.28843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204576.28913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204576.28950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204576.28996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.29021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204576.29066: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204576.29089: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.29107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.29239: Set connection var ansible_shell_type to sh 46400 1727204576.29261: Set connection var ansible_shell_executable to /bin/sh 46400 1727204576.29280: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204576.29294: Set connection var ansible_connection to ssh 46400 1727204576.29308: Set connection var ansible_pipelining to False 46400 1727204576.29321: Set connection var ansible_timeout to 10 46400 1727204576.29345: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.29351: variable 'ansible_connection' from source: unknown 46400 1727204576.29356: variable 'ansible_module_compression' from source: unknown 46400 1727204576.29361: variable 'ansible_shell_type' from source: unknown 46400 1727204576.29368: variable 'ansible_shell_executable' from source: unknown 46400 1727204576.29374: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.29380: variable 'ansible_pipelining' from source: unknown 46400 1727204576.29388: variable 'ansible_timeout' from source: unknown 46400 1727204576.29399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.29699: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204576.29722: variable 'omit' from source: magic vars 46400 
1727204576.29736: starting attempt loop 46400 1727204576.29747: running the handler 46400 1727204576.29784: _low_level_execute_command(): starting 46400 1727204576.29806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204576.30769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.30785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.30801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.30820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.30877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.30890: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.30904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.30930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.30946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.30968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.30984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.30999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.31019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.31038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.31055: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.31094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.31173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.31195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.31210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.31308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.32949: stdout chunk (state=3): >>>/root <<< 46400 1727204576.33054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204576.33139: stderr chunk (state=3): >>><<< 46400 1727204576.33142: stdout chunk (state=3): >>><<< 46400 1727204576.33308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204576.33312: _low_level_execute_command(): starting 46400 1727204576.33318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569 `" && echo ansible-tmp-1727204576.3317566-50886-101732674428569="` echo /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569 `" ) && sleep 0' 46400 1727204576.34096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.34119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.34145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.34177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.34232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.34246: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.34267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.34299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.34317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.34339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.34357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.34382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.34409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.34423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.34440: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.34455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.34568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.34596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.34619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.34701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.36594: stdout chunk (state=3): >>>ansible-tmp-1727204576.3317566-50886-101732674428569=/root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569 <<< 46400 1727204576.36871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204576.36875: stdout chunk (state=3): >>><<< 46400 1727204576.36881: stderr chunk (state=3): >>><<< 46400 1727204576.36884: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204576.3317566-50886-101732674428569=/root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204576.36975: variable 'ansible_module_compression' from source: unknown 46400 1727204576.36979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204576.37070: variable 'ansible_facts' from source: unknown 46400 1727204576.37095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/AnsiballZ_ping.py 46400 1727204576.37269: Sending initial data 46400 1727204576.37275: Sent initial data (153 bytes) 46400 1727204576.38363: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.38382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.38399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.38414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.38456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.38475: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.38491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.38518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.38535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.38546: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.38558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.38576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.38590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.38599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.38612: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.38623: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.38701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.38726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.38745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.38906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.40615: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204576.40656: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204576.40701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp0juj13r4 /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/AnsiballZ_ping.py <<< 46400 1727204576.40735: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204576.41974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204576.42183: stderr chunk (state=3): >>><<< 46400 1727204576.42190: stdout chunk (state=3): >>><<< 46400 1727204576.42195: done transferring module to remote 46400 1727204576.42198: _low_level_execute_command(): starting 46400 1727204576.42211: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/ /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/AnsiballZ_ping.py && sleep 0' 46400 1727204576.42937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.43204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.43220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.43237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.43286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.43299: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.43313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.43328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.43338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.43348: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.43358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.43382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.43397: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.43407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.43417: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.43428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.43507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.43524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.43539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.43783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.45419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204576.45423: stdout chunk (state=3): >>><<< 46400 1727204576.45425: stderr chunk (state=3): >>><<< 46400 1727204576.45520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204576.45526: _low_level_execute_command(): starting 46400 1727204576.45529: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/AnsiballZ_ping.py && sleep 0' 46400 1727204576.46278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.46294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.46311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.46328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.46372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.46385: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.46399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.46422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.46433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.46445: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.46467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.46576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.47016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.47029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.47041: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.47056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.47135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.47151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.47168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.47340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.60157: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204576.61171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204576.61287: stderr chunk (state=3): >>><<< 46400 1727204576.61291: stdout chunk (state=3): >>><<< 46400 1727204576.61571: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
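The exchange above is the complete round trip for the ping task: the controller stages AnsiballZ_ping.py in the freshly created remote tmp directory, runs it with /usr/bin/python3.9 over the existing SSH ControlMaster session, and reads a single JSON document back on stdout while all the OpenSSH debug output stays on stderr. The following is a minimal standalone sketch of that shape, not Ansible's implementation; the host, interpreter, and paths are taken from the log lines, the helper name is made up for illustration, and the elided tmp directory is deliberately kept as "...".

import json
import subprocess

def run_remote_module(host: str, remote_python: str, module_path: str) -> dict:
    """Run an already-transferred AnsiballZ-style module over ssh and parse its JSON result."""
    cmd = [
        "ssh", host,
        f"/bin/sh -c '{remote_python} {module_path} && sleep 0'",
    ]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        raise RuntimeError(f"module run failed rc={proc.returncode}: {proc.stderr}")
    # The module prints exactly one JSON object on stdout; SSH debug noise stays on stderr.
    return json.loads(proc.stdout)

if __name__ == "__main__":
    result = run_remote_module(
        "root@10.31.13.78",        # target seen in the log
        "/usr/bin/python3.9",      # interpreter seen in the log
        "/root/.ansible/tmp/ansible-tmp-.../AnsiballZ_ping.py",  # tmp dir elided on purpose
    )
    print("ok" if result.get("ping") == "pong" else "failed", result)

Against a reachable host, result.get("ping") == "pong" corresponds to the ok: [managed-node2] => {"changed": false, "ping": "pong"} result reported a little further down in the log.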
46400 1727204576.61575: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204576.61583: _low_level_execute_command(): starting 46400 1727204576.61586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204576.3317566-50886-101732674428569/ > /dev/null 2>&1 && sleep 0' 46400 1727204576.62643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204576.62862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.62890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.62915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.62975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.62995: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204576.63021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.63040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204576.63052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204576.63074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204576.63100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204576.63129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204576.63168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204576.63188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204576.63204: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204576.63260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204576.63349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204576.63510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204576.63525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204576.63615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204576.65422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204576.65522: stderr chunk (state=3): >>><<< 46400 1727204576.65572: stdout chunk (state=3): >>><<< 46400 1727204576.65605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204576.65612: handler run complete 46400 1727204576.65633: attempt loop complete, returning result 46400 1727204576.65643: _execute() done 46400 1727204576.65649: dumping result to json 46400 1727204576.65682: done dumping result, returning 46400 1727204576.65692: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000001474] 46400 1727204576.65698: sending task result for task 0affcd87-79f5-1303-fda8-000000001474 46400 1727204576.65831: done sending task result for task 0affcd87-79f5-1303-fda8-000000001474 46400 1727204576.65834: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204576.65954: no more pending results, returning what we have 46400 1727204576.65958: results queue empty 46400 1727204576.65959: checking for any_errors_fatal 46400 1727204576.65971: done checking for any_errors_fatal 46400 1727204576.65972: checking for max_fail_percentage 46400 1727204576.65974: done checking for max_fail_percentage 46400 1727204576.65976: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.65977: done checking to see if all hosts have failed 46400 1727204576.65977: getting the remaining hosts for this loop 46400 1727204576.65979: done getting the remaining hosts for this loop 46400 1727204576.65983: getting the next task for host managed-node2 46400 1727204576.66017: done getting next task for host managed-node2 46400 1727204576.66027: ^ task is: TASK: meta (role_complete) 46400 1727204576.66040: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204576.66055: getting variables 46400 1727204576.66057: in VariableManager get_vars() 46400 1727204576.66111: Calling all_inventory to load vars for managed-node2 46400 1727204576.66116: Calling groups_inventory to load vars for managed-node2 46400 1727204576.66118: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.66129: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.66132: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.66135: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.68829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.70778: done with get_vars() 46400 1727204576.70802: done getting variables 46400 1727204576.70893: done queuing things up, now waiting for results queue to drain 46400 1727204576.70896: results queue empty 46400 1727204576.70896: checking for any_errors_fatal 46400 1727204576.70899: done checking for any_errors_fatal 46400 1727204576.70900: checking for max_fail_percentage 46400 1727204576.70901: done checking for max_fail_percentage 46400 1727204576.70902: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.70903: done checking to see if all hosts have failed 46400 1727204576.70903: getting the remaining hosts for this loop 46400 1727204576.70904: done getting the remaining hosts for this loop 46400 1727204576.70911: getting the next task for host managed-node2 46400 1727204576.70916: done getting next task for host managed-node2 46400 1727204576.70918: ^ task is: TASK: Asserts 46400 1727204576.70920: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204576.70923: getting variables 46400 1727204576.70924: in VariableManager get_vars() 46400 1727204576.70936: Calling all_inventory to load vars for managed-node2 46400 1727204576.70938: Calling groups_inventory to load vars for managed-node2 46400 1727204576.70940: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.70945: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.70947: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.70949: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.72421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.74188: done with get_vars() 46400 1727204576.74215: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.475) 0:01:07.027 ***** 46400 1727204576.74293: entering _queue_task() for managed-node2/include_tasks 46400 1727204576.75322: worker is 1 (out of 1 available) 46400 1727204576.75355: exiting _queue_task() for managed-node2/include_tasks 46400 1727204576.75373: done queuing things up, now waiting for results queue to drain 46400 1727204576.75375: waiting for pending results... 46400 1727204576.75686: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204576.75828: in run() - task 0affcd87-79f5-1303-fda8-00000000100a 46400 1727204576.75847: variable 'ansible_search_path' from source: unknown 46400 1727204576.75856: variable 'ansible_search_path' from source: unknown 46400 1727204576.75909: variable 'lsr_assert' from source: include params 46400 1727204576.76121: variable 'lsr_assert' from source: include params 46400 1727204576.76202: variable 'omit' from source: magic vars 46400 1727204576.76345: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.76364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.76382: variable 'omit' from source: magic vars 46400 1727204576.76619: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.76633: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.76643: variable 'item' from source: unknown 46400 1727204576.76714: variable 'item' from source: unknown 46400 1727204576.76752: variable 'item' from source: unknown 46400 1727204576.76820: variable 'item' from source: unknown 46400 1727204576.77000: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.77012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.77026: variable 'omit' from source: magic vars 46400 1727204576.77254: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.77269: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.77285: variable 'item' from source: unknown 46400 1727204576.77348: variable 'item' from source: unknown 46400 1727204576.77387: variable 'item' from source: unknown 46400 1727204576.77449: variable 'item' from source: unknown 46400 1727204576.77543: dumping result to json 46400 1727204576.77552: done dumping result, returning 46400 1727204576.77567: done running TaskExecutor() for managed-node2/TASK: Asserts 
[0affcd87-79f5-1303-fda8-00000000100a] 46400 1727204576.77580: sending task result for task 0affcd87-79f5-1303-fda8-00000000100a 46400 1727204576.77671: no more pending results, returning what we have 46400 1727204576.77679: in VariableManager get_vars() 46400 1727204576.77725: Calling all_inventory to load vars for managed-node2 46400 1727204576.77728: Calling groups_inventory to load vars for managed-node2 46400 1727204576.77732: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.77746: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.77751: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.77755: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.78785: done sending task result for task 0affcd87-79f5-1303-fda8-00000000100a 46400 1727204576.78788: WORKER PROCESS EXITING 46400 1727204576.79925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.82766: done with get_vars() 46400 1727204576.82800: variable 'ansible_search_path' from source: unknown 46400 1727204576.82802: variable 'ansible_search_path' from source: unknown 46400 1727204576.82845: variable 'ansible_search_path' from source: unknown 46400 1727204576.82847: variable 'ansible_search_path' from source: unknown 46400 1727204576.82879: we have included files to process 46400 1727204576.82881: generating all_blocks data 46400 1727204576.82883: done generating all_blocks data 46400 1727204576.82889: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204576.82890: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204576.82897: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 46400 1727204576.83018: in VariableManager get_vars() 46400 1727204576.83040: done with get_vars() 46400 1727204576.83161: done processing included file 46400 1727204576.83166: iterating over new_blocks loaded from include file 46400 1727204576.83168: in VariableManager get_vars() 46400 1727204576.83183: done with get_vars() 46400 1727204576.83185: filtering new block on tags 46400 1727204576.83225: done filtering new block on tags 46400 1727204576.83227: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item=tasks/assert_device_present.yml) 46400 1727204576.83233: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204576.83234: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204576.83240: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204576.84171: in VariableManager get_vars() 46400 1727204576.84192: done with get_vars() 46400 1727204576.85321: done processing included file 46400 1727204576.85324: iterating over new_blocks loaded from include file 46400 1727204576.85325: in VariableManager get_vars() 46400 
1727204576.85381: done with get_vars() 46400 1727204576.85383: filtering new block on tags 46400 1727204576.85416: done filtering new block on tags 46400 1727204576.85419: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 46400 1727204576.85519: extending task lists for all hosts with included blocks 46400 1727204576.88011: done extending task lists 46400 1727204576.88013: done processing included files 46400 1727204576.88014: results queue empty 46400 1727204576.88015: checking for any_errors_fatal 46400 1727204576.88017: done checking for any_errors_fatal 46400 1727204576.88018: checking for max_fail_percentage 46400 1727204576.88019: done checking for max_fail_percentage 46400 1727204576.88020: checking to see if all hosts have failed and the running result is not ok 46400 1727204576.88021: done checking to see if all hosts have failed 46400 1727204576.88022: getting the remaining hosts for this loop 46400 1727204576.88023: done getting the remaining hosts for this loop 46400 1727204576.88026: getting the next task for host managed-node2 46400 1727204576.88030: done getting next task for host managed-node2 46400 1727204576.88033: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204576.88036: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204576.88045: getting variables 46400 1727204576.88046: in VariableManager get_vars() 46400 1727204576.88060: Calling all_inventory to load vars for managed-node2 46400 1727204576.88063: Calling groups_inventory to load vars for managed-node2 46400 1727204576.88067: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.88074: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.88077: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.88080: Calling groups_plugins_play to load vars for managed-node2 46400 1727204576.90589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204576.94262: done with get_vars() 46400 1727204576.94296: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:02:56 -0400 (0:00:00.208) 0:01:07.236 ***** 46400 1727204576.95184: entering _queue_task() for managed-node2/include_tasks 46400 1727204576.95942: worker is 1 (out of 1 available) 46400 1727204576.95956: exiting _queue_task() for managed-node2/include_tasks 46400 1727204576.95973: done queuing things up, now waiting for results queue to drain 46400 1727204576.95975: waiting for pending results... 46400 1727204576.97209: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204576.97454: in run() - task 0affcd87-79f5-1303-fda8-0000000015cf 46400 1727204576.97565: variable 'ansible_search_path' from source: unknown 46400 1727204576.97574: variable 'ansible_search_path' from source: unknown 46400 1727204576.97620: calling self._execute() 46400 1727204576.97833: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204576.97846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204576.97861: variable 'omit' from source: magic vars 46400 1727204576.98591: variable 'ansible_distribution_major_version' from source: facts 46400 1727204576.98685: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204576.98752: _execute() done 46400 1727204576.98761: dumping result to json 46400 1727204576.98771: done dumping result, returning 46400 1727204576.98780: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-0000000015cf] 46400 1727204576.98790: sending task result for task 0affcd87-79f5-1303-fda8-0000000015cf 46400 1727204576.98929: no more pending results, returning what we have 46400 1727204576.98935: in VariableManager get_vars() 46400 1727204576.98986: Calling all_inventory to load vars for managed-node2 46400 1727204576.98990: Calling groups_inventory to load vars for managed-node2 46400 1727204576.98994: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204576.99009: Calling all_plugins_play to load vars for managed-node2 46400 1727204576.99013: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204576.99016: Calling groups_plugins_play to load vars for managed-node2 46400 1727204577.00427: done sending task result for task 0affcd87-79f5-1303-fda8-0000000015cf 46400 1727204577.00431: WORKER PROCESS EXITING 46400 1727204577.14347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204577.16875: done with get_vars() 46400 1727204577.16906: variable 'ansible_search_path' from source: unknown 46400 1727204577.16907: variable 'ansible_search_path' from source: unknown 46400 1727204577.16917: variable 'item' from source: include params 46400 1727204577.17011: variable 'item' from source: include params 46400 1727204577.17042: we have included files to process 46400 1727204577.17043: generating all_blocks data 46400 1727204577.17044: done generating all_blocks data 46400 1727204577.17045: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204577.17046: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204577.17048: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204577.17211: done processing included file 46400 1727204577.17213: iterating over new_blocks loaded from include file 46400 1727204577.17215: in VariableManager get_vars() 46400 1727204577.17231: done with get_vars() 46400 1727204577.17233: filtering new block on tags 46400 1727204577.17259: done filtering new block on tags 46400 1727204577.17262: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204577.17268: extending task lists for all hosts with included blocks 46400 1727204577.17435: done extending task lists 46400 1727204577.17436: done processing included files 46400 1727204577.17437: results queue empty 46400 1727204577.17438: checking for any_errors_fatal 46400 1727204577.17441: done checking for any_errors_fatal 46400 1727204577.17441: checking for max_fail_percentage 46400 1727204577.17443: done checking for max_fail_percentage 46400 1727204577.17444: checking to see if all hosts have failed and the running result is not ok 46400 1727204577.17445: done checking to see if all hosts have failed 46400 1727204577.17446: getting the remaining hosts for this loop 46400 1727204577.17447: done getting the remaining hosts for this loop 46400 1727204577.17449: getting the next task for host managed-node2 46400 1727204577.17453: done getting next task for host managed-node2 46400 1727204577.17455: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204577.17458: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204577.17460: getting variables 46400 1727204577.17461: in VariableManager get_vars() 46400 1727204577.17473: Calling all_inventory to load vars for managed-node2 46400 1727204577.17475: Calling groups_inventory to load vars for managed-node2 46400 1727204577.17478: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204577.17482: Calling all_plugins_play to load vars for managed-node2 46400 1727204577.17485: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204577.17487: Calling groups_plugins_play to load vars for managed-node2 46400 1727204577.18992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204577.21145: done with get_vars() 46400 1727204577.21179: done getting variables 46400 1727204577.21322: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.261) 0:01:07.498 ***** 46400 1727204577.21351: entering _queue_task() for managed-node2/stat 46400 1727204577.21766: worker is 1 (out of 1 available) 46400 1727204577.21781: exiting _queue_task() for managed-node2/stat 46400 1727204577.21796: done queuing things up, now waiting for results queue to drain 46400 1727204577.21797: waiting for pending results... 46400 1727204577.22384: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204577.22635: in run() - task 0affcd87-79f5-1303-fda8-000000001647 46400 1727204577.22659: variable 'ansible_search_path' from source: unknown 46400 1727204577.22670: variable 'ansible_search_path' from source: unknown 46400 1727204577.22713: calling self._execute() 46400 1727204577.22817: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.22841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.22945: variable 'omit' from source: magic vars 46400 1727204577.23836: variable 'ansible_distribution_major_version' from source: facts 46400 1727204577.23856: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204577.23870: variable 'omit' from source: magic vars 46400 1727204577.24047: variable 'omit' from source: magic vars 46400 1727204577.24204: variable 'interface' from source: play vars 46400 1727204577.24246: variable 'omit' from source: magic vars 46400 1727204577.24383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204577.24425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204577.24455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204577.24484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204577.24509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204577.24553: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204577.24567: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.24587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.24760: Set connection var ansible_shell_type to sh 46400 1727204577.24811: Set connection var ansible_shell_executable to /bin/sh 46400 1727204577.24822: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204577.24832: Set connection var ansible_connection to ssh 46400 1727204577.24917: Set connection var ansible_pipelining to False 46400 1727204577.24929: Set connection var ansible_timeout to 10 46400 1727204577.24961: variable 'ansible_shell_executable' from source: unknown 46400 1727204577.24974: variable 'ansible_connection' from source: unknown 46400 1727204577.24982: variable 'ansible_module_compression' from source: unknown 46400 1727204577.24989: variable 'ansible_shell_type' from source: unknown 46400 1727204577.24995: variable 'ansible_shell_executable' from source: unknown 46400 1727204577.25002: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.25011: variable 'ansible_pipelining' from source: unknown 46400 1727204577.25032: variable 'ansible_timeout' from source: unknown 46400 1727204577.25057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.25361: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204577.25380: variable 'omit' from source: magic vars 46400 1727204577.25395: starting attempt loop 46400 1727204577.25402: running the handler 46400 1727204577.25421: _low_level_execute_command(): starting 46400 1727204577.25433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204577.28477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204577.28492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.28504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.28519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.28560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.28578: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.28588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.28602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.28611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204577.28618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.28625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.28639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.28647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.28655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 
1727204577.28667: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.28681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.28750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.28776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.28793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.28893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.30585: stdout chunk (state=3): >>>/root <<< 46400 1727204577.30736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204577.30797: stderr chunk (state=3): >>><<< 46400 1727204577.30800: stdout chunk (state=3): >>><<< 46400 1727204577.30828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204577.30844: _low_level_execute_command(): starting 46400 1727204577.30853: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215 `" && echo ansible-tmp-1727204577.3082676-50943-172512552082215="` echo /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215 `" ) && sleep 0' 46400 1727204577.32314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204577.32346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.32382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.32424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.32505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.32527: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.32559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.32611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.32635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204577.32663: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.32698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.32727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.32754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.32787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.32820: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.32969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.33135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.33170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.33192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.33321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.35288: stdout chunk (state=3): >>>ansible-tmp-1727204577.3082676-50943-172512552082215=/root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215 <<< 46400 1727204577.35495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204577.35498: stdout chunk (state=3): >>><<< 46400 1727204577.35501: stderr chunk (state=3): >>><<< 46400 1727204577.35675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204577.3082676-50943-172512552082215=/root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204577.35679: variable 'ansible_module_compression' from source: unknown 46400 1727204577.35682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204577.35783: variable 'ansible_facts' from source: unknown 46400 1727204577.35804: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/AnsiballZ_stat.py 46400 1727204577.36121: Sending initial data 46400 1727204577.36124: Sent initial data (153 bytes) 46400 1727204577.38152: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 
1727204577.38175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.38190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.38208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.38279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.38301: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.38336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.38367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.38382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204577.38397: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.38416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.38431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.38448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.38470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.38494: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.38537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.38632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.38656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.39205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.39492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.41334: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204577.41373: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204577.41410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpxh9ykmfa /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/AnsiballZ_stat.py <<< 46400 1727204577.41457: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204577.42731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204577.42953: stderr chunk (state=3): >>><<< 46400 1727204577.42956: stdout chunk (state=3): >>><<< 46400 1727204577.42958: done transferring module to remote 46400 1727204577.42960: _low_level_execute_command(): starting 
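The sftp transfer above happens because pipelining is off for this connection (the log records "Set connection var ansible_pipelining to False" for each task), so every module run copies an AnsiballZ payload into a remote temp directory before executing it. A hypothetical host_vars sketch that would enable pipelining and skip the temp-dir and transfer steps, assuming the target's sudoers/tty settings allow it, might look like:

    # hypothetical host_vars/managed-node1.yml sketch, not part of this test run
    ansible_pipelining: true   # send the module over the existing SSH channel instead of sftp + remote tmp dir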
46400 1727204577.42978: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/ /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/AnsiballZ_stat.py && sleep 0' 46400 1727204577.43736: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204577.43750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.43799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.43841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.43912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.43928: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.43945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.43976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.43990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204577.44011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.44025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.44043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.44070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.44086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.44098: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.44112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.44210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.44232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.44278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.44387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.46228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204577.46240: stderr chunk (state=3): >>><<< 46400 1727204577.46248: stdout chunk (state=3): >>><<< 46400 1727204577.46350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204577.46354: _low_level_execute_command(): starting 46400 1727204577.46357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/AnsiballZ_stat.py && sleep 0' 46400 1727204577.47080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204577.47123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.47140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.47155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.47252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.47266: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.47282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.47297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.47306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204577.47314: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.47336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.47351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.47369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.47380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.47396: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.47407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.47567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.47595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.47617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.47777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.61083: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32812, "dev": 21, "nlink": 1, "atime": 1727204561.1846428, "mtime": 1727204561.1846428, "ctime": 1727204561.1846428, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", 
"pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204577.62286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204577.62291: stdout chunk (state=3): >>><<< 46400 1727204577.62293: stderr chunk (state=3): >>><<< 46400 1727204577.62371: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32812, "dev": 21, "nlink": 1, "atime": 1727204561.1846428, "mtime": 1727204561.1846428, "ctime": 1727204561.1846428, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204577.62481: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204577.62486: _low_level_execute_command(): starting 46400 1727204577.62490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204577.3082676-50943-172512552082215/ > /dev/null 2>&1 && sleep 0' 46400 1727204577.63088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204577.63103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.63119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.63136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.63182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.63194: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204577.63207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.63223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204577.63234: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204577.63245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204577.63256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204577.63274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204577.63290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204577.63302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204577.63313: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204577.63327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204577.63404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204577.63421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204577.63435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204577.63507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204577.65431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204577.65435: stdout chunk (state=3): >>><<< 46400 1727204577.65437: stderr chunk (state=3): >>><<< 46400 1727204577.65787: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204577.65791: handler run complete 46400 1727204577.65793: attempt loop complete, returning result 46400 1727204577.65796: _execute() done 46400 1727204577.65798: dumping result to json 46400 1727204577.65800: done dumping result, returning 46400 1727204577.65802: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000001647] 46400 1727204577.65804: sending task result for task 0affcd87-79f5-1303-fda8-000000001647 ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204561.1846428, "block_size": 4096, "blocks": 0, "ctime": 1727204561.1846428, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32812, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204561.1846428, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 46400 1727204577.65978: no more pending results, returning what we have 46400 1727204577.65982: results queue empty 46400 1727204577.65982: checking for any_errors_fatal 46400 1727204577.65985: done checking for any_errors_fatal 46400 1727204577.65986: checking for max_fail_percentage 46400 1727204577.65988: done checking for max_fail_percentage 46400 1727204577.65988: checking to see if all hosts have failed and the running result is not ok 46400 1727204577.65989: done checking to see if all hosts have failed 46400 1727204577.65990: getting the remaining hosts for this loop 46400 1727204577.65991: done getting the remaining hosts for this loop 46400 1727204577.65995: getting the next task for host managed-node2 46400 1727204577.66003: done getting next task for host managed-node2 46400 1727204577.66006: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 46400 1727204577.66008: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204577.66013: getting variables 46400 1727204577.66018: in VariableManager get_vars() 46400 1727204577.66051: Calling all_inventory to load vars for managed-node2 46400 1727204577.66054: Calling groups_inventory to load vars for managed-node2 46400 1727204577.66057: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204577.66069: Calling all_plugins_play to load vars for managed-node2 46400 1727204577.66072: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204577.66074: Calling groups_plugins_play to load vars for managed-node2 46400 1727204577.66615: done sending task result for task 0affcd87-79f5-1303-fda8-000000001647 46400 1727204577.66620: WORKER PROCESS EXITING 46400 1727204577.69328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204577.72055: done with get_vars() 46400 1727204577.72099: done getting variables 46400 1727204577.72169: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204577.72376: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.510) 0:01:08.008 ***** 46400 1727204577.72408: entering _queue_task() for managed-node2/assert 46400 1727204577.72914: worker is 1 (out of 1 available) 46400 1727204577.72936: exiting _queue_task() for managed-node2/assert 46400 1727204577.72952: done queuing things up, now waiting for results queue to drain 46400 1727204577.72953: waiting for pending results... 
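The assert task just queued (assert_device_present.yml:5) checks, per the "Evaluated conditional (interface_stat.stat.exists): True" entry recorded below, that the stat result from the previous step reports an existing path. A hypothetical equivalent of that task, not copied from the actual playbook file, would be:

    # hypothetical sketch of the assert step; the condition matches the evaluated conditional logged below
    - name: "Assert that the interface is present - '{{ interface }}'"
      assert:
        that:
          - interface_stat.stat.exists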
46400 1727204577.74142: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 46400 1727204577.74489: in run() - task 0affcd87-79f5-1303-fda8-0000000015d0 46400 1727204577.74522: variable 'ansible_search_path' from source: unknown 46400 1727204577.74538: variable 'ansible_search_path' from source: unknown 46400 1727204577.74586: calling self._execute() 46400 1727204577.74841: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.74971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.74988: variable 'omit' from source: magic vars 46400 1727204577.75595: variable 'ansible_distribution_major_version' from source: facts 46400 1727204577.75640: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204577.75651: variable 'omit' from source: magic vars 46400 1727204577.75708: variable 'omit' from source: magic vars 46400 1727204577.75815: variable 'interface' from source: play vars 46400 1727204577.75842: variable 'omit' from source: magic vars 46400 1727204577.75892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204577.75936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204577.75968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204577.75993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204577.76008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204577.76048: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204577.76056: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.76069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.76177: Set connection var ansible_shell_type to sh 46400 1727204577.76192: Set connection var ansible_shell_executable to /bin/sh 46400 1727204577.76201: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204577.76211: Set connection var ansible_connection to ssh 46400 1727204577.76221: Set connection var ansible_pipelining to False 46400 1727204577.76231: Set connection var ansible_timeout to 10 46400 1727204577.76272: variable 'ansible_shell_executable' from source: unknown 46400 1727204577.76281: variable 'ansible_connection' from source: unknown 46400 1727204577.76288: variable 'ansible_module_compression' from source: unknown 46400 1727204577.76295: variable 'ansible_shell_type' from source: unknown 46400 1727204577.76302: variable 'ansible_shell_executable' from source: unknown 46400 1727204577.76308: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.76315: variable 'ansible_pipelining' from source: unknown 46400 1727204577.76321: variable 'ansible_timeout' from source: unknown 46400 1727204577.76328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.76521: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204577.76538: variable 'omit' from source: magic vars 46400 1727204577.76553: starting attempt loop 46400 1727204577.76586: running the handler 46400 1727204577.77566: variable 'interface_stat' from source: set_fact 46400 1727204577.77643: Evaluated conditional (interface_stat.stat.exists): True 46400 1727204577.77654: handler run complete 46400 1727204577.77681: attempt loop complete, returning result 46400 1727204577.77689: _execute() done 46400 1727204577.77696: dumping result to json 46400 1727204577.77703: done dumping result, returning 46400 1727204577.77719: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [0affcd87-79f5-1303-fda8-0000000015d0] 46400 1727204577.77730: sending task result for task 0affcd87-79f5-1303-fda8-0000000015d0 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204577.77968: no more pending results, returning what we have 46400 1727204577.77973: results queue empty 46400 1727204577.77975: checking for any_errors_fatal 46400 1727204577.77994: done checking for any_errors_fatal 46400 1727204577.77995: checking for max_fail_percentage 46400 1727204577.78004: done checking for max_fail_percentage 46400 1727204577.78006: checking to see if all hosts have failed and the running result is not ok 46400 1727204577.78007: done checking to see if all hosts have failed 46400 1727204577.78007: getting the remaining hosts for this loop 46400 1727204577.78010: done getting the remaining hosts for this loop 46400 1727204577.78014: getting the next task for host managed-node2 46400 1727204577.78043: done getting next task for host managed-node2 46400 1727204577.78048: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204577.78052: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204577.78075: getting variables 46400 1727204577.78078: in VariableManager get_vars() 46400 1727204577.78157: Calling all_inventory to load vars for managed-node2 46400 1727204577.78162: Calling groups_inventory to load vars for managed-node2 46400 1727204577.78174: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204577.78192: Calling all_plugins_play to load vars for managed-node2 46400 1727204577.78195: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204577.78199: Calling groups_plugins_play to load vars for managed-node2 46400 1727204577.78731: done sending task result for task 0affcd87-79f5-1303-fda8-0000000015d0 46400 1727204577.78735: WORKER PROCESS EXITING 46400 1727204577.81656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204577.84825: done with get_vars() 46400 1727204577.84884: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:02:57 -0400 (0:00:00.127) 0:01:08.136 ***** 46400 1727204577.85169: entering _queue_task() for managed-node2/include_tasks 46400 1727204577.85734: worker is 1 (out of 1 available) 46400 1727204577.85747: exiting _queue_task() for managed-node2/include_tasks 46400 1727204577.85761: done queuing things up, now waiting for results queue to drain 46400 1727204577.85787: waiting for pending results... 46400 1727204577.86099: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204577.86250: in run() - task 0affcd87-79f5-1303-fda8-0000000015d4 46400 1727204577.86267: variable 'ansible_search_path' from source: unknown 46400 1727204577.86271: variable 'ansible_search_path' from source: unknown 46400 1727204577.86305: calling self._execute() 46400 1727204577.86399: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204577.86406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204577.86416: variable 'omit' from source: magic vars 46400 1727204577.87084: variable 'ansible_distribution_major_version' from source: facts 46400 1727204577.87096: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204577.87103: _execute() done 46400 1727204577.87106: dumping result to json 46400 1727204577.87113: done dumping result, returning 46400 1727204577.87121: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-0000000015d4] 46400 1727204577.87128: sending task result for task 0affcd87-79f5-1303-fda8-0000000015d4 46400 1727204577.87248: done sending task result for task 0affcd87-79f5-1303-fda8-0000000015d4 46400 1727204577.87252: WORKER PROCESS EXITING 46400 1727204577.87282: no more pending results, returning what we have 46400 1727204577.87287: in VariableManager get_vars() 46400 1727204577.87339: Calling all_inventory to load vars for managed-node2 46400 1727204577.87342: Calling groups_inventory to load vars for managed-node2 46400 1727204577.87353: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204577.87370: Calling all_plugins_play to load vars for managed-node2 46400 1727204577.87374: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204577.87377: Calling groups_plugins_play to 
load vars for managed-node2 46400 1727204577.91749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204577.98096: done with get_vars() 46400 1727204577.98121: variable 'ansible_search_path' from source: unknown 46400 1727204577.98123: variable 'ansible_search_path' from source: unknown 46400 1727204577.98133: variable 'item' from source: include params 46400 1727204577.98246: variable 'item' from source: include params 46400 1727204577.98285: we have included files to process 46400 1727204577.98286: generating all_blocks data 46400 1727204577.98288: done generating all_blocks data 46400 1727204577.98294: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204577.98295: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204577.98298: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204577.99947: done processing included file 46400 1727204577.99949: iterating over new_blocks loaded from include file 46400 1727204577.99950: in VariableManager get_vars() 46400 1727204578.00675: done with get_vars() 46400 1727204578.00677: filtering new block on tags 46400 1727204578.00757: done filtering new block on tags 46400 1727204578.00760: in VariableManager get_vars() 46400 1727204578.00782: done with get_vars() 46400 1727204578.00784: filtering new block on tags 46400 1727204578.00846: done filtering new block on tags 46400 1727204578.00849: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204578.00855: extending task lists for all hosts with included blocks 46400 1727204578.01126: done extending task lists 46400 1727204578.01128: done processing included files 46400 1727204578.01129: results queue empty 46400 1727204578.01130: checking for any_errors_fatal 46400 1727204578.01133: done checking for any_errors_fatal 46400 1727204578.01133: checking for max_fail_percentage 46400 1727204578.01134: done checking for max_fail_percentage 46400 1727204578.01135: checking to see if all hosts have failed and the running result is not ok 46400 1727204578.01136: done checking to see if all hosts have failed 46400 1727204578.01137: getting the remaining hosts for this loop 46400 1727204578.01138: done getting the remaining hosts for this loop 46400 1727204578.01141: getting the next task for host managed-node2 46400 1727204578.01146: done getting next task for host managed-node2 46400 1727204578.01148: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204578.01151: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204578.01153: getting variables 46400 1727204578.01155: in VariableManager get_vars() 46400 1727204578.01167: Calling all_inventory to load vars for managed-node2 46400 1727204578.01170: Calling groups_inventory to load vars for managed-node2 46400 1727204578.01172: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204578.01179: Calling all_plugins_play to load vars for managed-node2 46400 1727204578.01181: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204578.01184: Calling groups_plugins_play to load vars for managed-node2 46400 1727204578.03777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204578.07692: done with get_vars() 46400 1727204578.07730: done getting variables 46400 1727204578.07782: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.226) 0:01:08.362 ***** 46400 1727204578.07816: entering _queue_task() for managed-node2/set_fact 46400 1727204578.08161: worker is 1 (out of 1 available) 46400 1727204578.08176: exiting _queue_task() for managed-node2/set_fact 46400 1727204578.08189: done queuing things up, now waiting for results queue to drain 46400 1727204578.08190: waiting for pending results... 
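The set_fact task queued here (get_profile_stat.yml:3) initializes the three lsr_net_profile_* flags that appear in the ok: result below. A hypothetical sketch of such a task, with the values taken from the ansible_facts reported in that result, would be:

    # hypothetical sketch of the flag-initialization step; values mirror the ansible_facts in the result below
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false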
46400 1727204578.08798: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204578.09328: in run() - task 0affcd87-79f5-1303-fda8-000000001665 46400 1727204578.09401: variable 'ansible_search_path' from source: unknown 46400 1727204578.09413: variable 'ansible_search_path' from source: unknown 46400 1727204578.09459: calling self._execute() 46400 1727204578.09823: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.09840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.09859: variable 'omit' from source: magic vars 46400 1727204578.10737: variable 'ansible_distribution_major_version' from source: facts 46400 1727204578.10756: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204578.10775: variable 'omit' from source: magic vars 46400 1727204578.10867: variable 'omit' from source: magic vars 46400 1727204578.10984: variable 'omit' from source: magic vars 46400 1727204578.11036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204578.11088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204578.11116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204578.11140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.11178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.11219: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204578.11228: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.11236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.11454: Set connection var ansible_shell_type to sh 46400 1727204578.11503: Set connection var ansible_shell_executable to /bin/sh 46400 1727204578.11515: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204578.11528: Set connection var ansible_connection to ssh 46400 1727204578.11543: Set connection var ansible_pipelining to False 46400 1727204578.11554: Set connection var ansible_timeout to 10 46400 1727204578.11588: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.11604: variable 'ansible_connection' from source: unknown 46400 1727204578.11612: variable 'ansible_module_compression' from source: unknown 46400 1727204578.11618: variable 'ansible_shell_type' from source: unknown 46400 1727204578.11624: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.11630: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.11637: variable 'ansible_pipelining' from source: unknown 46400 1727204578.11644: variable 'ansible_timeout' from source: unknown 46400 1727204578.11651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.11824: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204578.11842: variable 
'omit' from source: magic vars 46400 1727204578.11852: starting attempt loop 46400 1727204578.11858: running the handler 46400 1727204578.11881: handler run complete 46400 1727204578.11896: attempt loop complete, returning result 46400 1727204578.11902: _execute() done 46400 1727204578.11907: dumping result to json 46400 1727204578.11913: done dumping result, returning 46400 1727204578.11930: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-000000001665] 46400 1727204578.11939: sending task result for task 0affcd87-79f5-1303-fda8-000000001665 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204578.12112: no more pending results, returning what we have 46400 1727204578.12117: results queue empty 46400 1727204578.12118: checking for any_errors_fatal 46400 1727204578.12120: done checking for any_errors_fatal 46400 1727204578.12120: checking for max_fail_percentage 46400 1727204578.12122: done checking for max_fail_percentage 46400 1727204578.12123: checking to see if all hosts have failed and the running result is not ok 46400 1727204578.12124: done checking to see if all hosts have failed 46400 1727204578.12125: getting the remaining hosts for this loop 46400 1727204578.12127: done getting the remaining hosts for this loop 46400 1727204578.12130: getting the next task for host managed-node2 46400 1727204578.12139: done getting next task for host managed-node2 46400 1727204578.12142: ^ task is: TASK: Stat profile file 46400 1727204578.12148: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204578.12153: getting variables 46400 1727204578.12154: in VariableManager get_vars() 46400 1727204578.12196: Calling all_inventory to load vars for managed-node2 46400 1727204578.12199: Calling groups_inventory to load vars for managed-node2 46400 1727204578.12203: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204578.12216: Calling all_plugins_play to load vars for managed-node2 46400 1727204578.12218: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204578.12221: Calling groups_plugins_play to load vars for managed-node2 46400 1727204578.12771: done sending task result for task 0affcd87-79f5-1303-fda8-000000001665 46400 1727204578.12774: WORKER PROCESS EXITING 46400 1727204578.15395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204578.19416: done with get_vars() 46400 1727204578.19454: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.117) 0:01:08.480 ***** 46400 1727204578.19562: entering _queue_task() for managed-node2/stat 46400 1727204578.21008: worker is 1 (out of 1 available) 46400 1727204578.21017: exiting _queue_task() for managed-node2/stat 46400 1727204578.21027: done queuing things up, now waiting for results queue to drain 46400 1727204578.21028: waiting for pending results... 46400 1727204578.21104: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204578.21236: in run() - task 0affcd87-79f5-1303-fda8-000000001666 46400 1727204578.21251: variable 'ansible_search_path' from source: unknown 46400 1727204578.21354: variable 'ansible_search_path' from source: unknown 46400 1727204578.21399: calling self._execute() 46400 1727204578.21550: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.21554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.21566: variable 'omit' from source: magic vars 46400 1727204578.23591: variable 'ansible_distribution_major_version' from source: facts 46400 1727204578.23604: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204578.23611: variable 'omit' from source: magic vars 46400 1727204578.23810: variable 'omit' from source: magic vars 46400 1727204578.24035: variable 'profile' from source: play vars 46400 1727204578.24040: variable 'interface' from source: play vars 46400 1727204578.24265: variable 'interface' from source: play vars 46400 1727204578.24293: variable 'omit' from source: magic vars 46400 1727204578.24339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204578.24484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204578.24506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204578.24525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.24537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.24686: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 46400 1727204578.24690: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.24694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.24924: Set connection var ansible_shell_type to sh 46400 1727204578.24935: Set connection var ansible_shell_executable to /bin/sh 46400 1727204578.24940: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204578.24946: Set connection var ansible_connection to ssh 46400 1727204578.24952: Set connection var ansible_pipelining to False 46400 1727204578.24958: Set connection var ansible_timeout to 10 46400 1727204578.25122: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.25125: variable 'ansible_connection' from source: unknown 46400 1727204578.25128: variable 'ansible_module_compression' from source: unknown 46400 1727204578.25130: variable 'ansible_shell_type' from source: unknown 46400 1727204578.25133: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.25135: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.25137: variable 'ansible_pipelining' from source: unknown 46400 1727204578.25140: variable 'ansible_timeout' from source: unknown 46400 1727204578.25145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.25597: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204578.25607: variable 'omit' from source: magic vars 46400 1727204578.25613: starting attempt loop 46400 1727204578.25616: running the handler 46400 1727204578.25630: _low_level_execute_command(): starting 46400 1727204578.25638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204578.27817: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.27827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.27970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.27974: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.28059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.28069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.28152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.28184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.28277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 46400 1727204578.30122: stdout chunk (state=3): >>>/root <<< 46400 1727204578.30126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.30128: stdout chunk (state=3): >>><<< 46400 1727204578.30131: stderr chunk (state=3): >>><<< 46400 1727204578.30263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.30269: _low_level_execute_command(): starting 46400 1727204578.30273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454 `" && echo ansible-tmp-1727204578.3015816-51065-82529476477454="` echo /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454 `" ) && sleep 0' 46400 1727204578.31104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.31117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.31135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.31154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.31208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.31219: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.31231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.31256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.31272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.31283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.31294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.31307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.31321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.31333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.31344: stderr chunk 
(state=3): >>>debug2: match found <<< 46400 1727204578.31369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.31444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.31471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.31487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.31600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.33501: stdout chunk (state=3): >>>ansible-tmp-1727204578.3015816-51065-82529476477454=/root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454 <<< 46400 1727204578.33710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.33715: stdout chunk (state=3): >>><<< 46400 1727204578.33718: stderr chunk (state=3): >>><<< 46400 1727204578.33770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204578.3015816-51065-82529476477454=/root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.33971: variable 'ansible_module_compression' from source: unknown 46400 1727204578.33974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204578.33976: variable 'ansible_facts' from source: unknown 46400 1727204578.34005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/AnsiballZ_stat.py 46400 1727204578.34554: Sending initial data 46400 1727204578.34559: Sent initial data (152 bytes) 46400 1727204578.36097: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.36101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.36142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.36145: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204578.36147: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.36149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.36151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.36220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.36223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.36226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.36281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.38129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204578.38134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmprsxw0pyd /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/AnsiballZ_stat.py <<< 46400 1727204578.39505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.39673: stderr chunk (state=3): >>><<< 46400 1727204578.39677: stdout chunk (state=3): >>><<< 46400 1727204578.39771: done transferring module to remote 46400 1727204578.39775: _low_level_execute_command(): starting 46400 1727204578.39778: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/ /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/AnsiballZ_stat.py && sleep 0' 46400 1727204578.40697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.40705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.40716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.40729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.40789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.40796: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.40807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.40820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.40827: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.40835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.40845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.40852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.40874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.40889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.40896: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.40906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.40988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.41005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.41017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.41086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.42883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.42888: stdout chunk (state=3): >>><<< 46400 1727204578.42893: stderr chunk (state=3): >>><<< 46400 1727204578.42909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.42912: _low_level_execute_command(): starting 46400 1727204578.42918: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/AnsiballZ_stat.py && sleep 0' 46400 1727204578.44320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.44329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.44340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.44353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.44394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.44404: stderr chunk (state=3): >>>debug2: 
match not found <<< 46400 1727204578.44418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.44433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.44493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.44500: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.44508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.44519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.44535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.44542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.44549: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.44559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.44628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.44650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.44667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.44740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.57878: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204578.58842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.58887: stderr chunk (state=3): >>>Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204578.58975: stderr chunk (state=3): >>><<< 46400 1727204578.58997: stdout chunk (state=3): >>><<< 46400 1727204578.59001: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204578.59053: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204578.59066: _low_level_execute_command(): starting 46400 1727204578.59069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204578.3015816-51065-82529476477454/ > /dev/null 2>&1 && sleep 0' 46400 1727204578.59874: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.59884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.59895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.59908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.59948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.59955: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.59971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.59988: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 46400 1727204578.60010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.60013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.60015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.60023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.60035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.60052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.60058: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.60073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.60142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.60156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.60162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.60241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.62888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.62978: stderr chunk (state=3): >>><<< 46400 1727204578.62994: stdout chunk (state=3): >>><<< 46400 1727204578.63031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.63043: handler run complete 46400 1727204578.63063: attempt loop complete, returning result 46400 1727204578.63068: _execute() done 46400 1727204578.63071: dumping result to json 46400 1727204578.63073: done dumping result, returning 46400 1727204578.63082: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-000000001666] 46400 1727204578.63098: sending task result for task 0affcd87-79f5-1303-fda8-000000001666 46400 1727204578.63215: done sending task result for task 0affcd87-79f5-1303-fda8-000000001666 46400 1727204578.63218: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204578.63290: no more pending results, returning what we have 46400 
1727204578.63295: results queue empty 46400 1727204578.63296: checking for any_errors_fatal 46400 1727204578.63305: done checking for any_errors_fatal 46400 1727204578.63305: checking for max_fail_percentage 46400 1727204578.63307: done checking for max_fail_percentage 46400 1727204578.63308: checking to see if all hosts have failed and the running result is not ok 46400 1727204578.63309: done checking to see if all hosts have failed 46400 1727204578.63310: getting the remaining hosts for this loop 46400 1727204578.63311: done getting the remaining hosts for this loop 46400 1727204578.63315: getting the next task for host managed-node2 46400 1727204578.63324: done getting next task for host managed-node2 46400 1727204578.63326: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204578.63334: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204578.63338: getting variables 46400 1727204578.63340: in VariableManager get_vars() 46400 1727204578.63383: Calling all_inventory to load vars for managed-node2 46400 1727204578.63386: Calling groups_inventory to load vars for managed-node2 46400 1727204578.63389: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204578.63400: Calling all_plugins_play to load vars for managed-node2 46400 1727204578.63402: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204578.63404: Calling groups_plugins_play to load vars for managed-node2 46400 1727204578.65958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204578.67816: done with get_vars() 46400 1727204578.67847: done getting variables 46400 1727204578.67920: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.483) 0:01:08.964 ***** 46400 1727204578.67958: entering _queue_task() for managed-node2/set_fact 46400 1727204578.68346: worker is 1 (out of 1 available) 46400 1727204578.68362: exiting _queue_task() for managed-node2/set_fact 46400 1727204578.68378: done queuing things up, now waiting for results queue to drain 46400 1727204578.68379: waiting for pending results... 
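For reference, the stat call completed above maps onto a task of roughly this shape in get_profile_stat.yml. This is a sketch reconstructed from the logged module_args, not the actual file contents: the register name is inferred from the later 'profile_stat' variable lookups, the follow/checksum_algorithm values in the logged args are just module defaults, and the hard-coded path is presumably templated from the 'profile'/'interface' play vars in the source.

  - name: Stat profile file
    ansible.builtin.stat:
      path: /etc/sysconfig/network-scripts/ifcfg-statebr   # assumed to be templated in the source
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat   # inferred; registered vars surface as "from source: set_fact" in this log
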
46400 1727204578.68793: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204578.69005: in run() - task 0affcd87-79f5-1303-fda8-000000001667 46400 1727204578.69383: variable 'ansible_search_path' from source: unknown 46400 1727204578.69386: variable 'ansible_search_path' from source: unknown 46400 1727204578.69390: calling self._execute() 46400 1727204578.69410: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.69416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.69425: variable 'omit' from source: magic vars 46400 1727204578.69797: variable 'ansible_distribution_major_version' from source: facts 46400 1727204578.69808: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204578.69933: variable 'profile_stat' from source: set_fact 46400 1727204578.69942: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204578.69946: when evaluation is False, skipping this task 46400 1727204578.69948: _execute() done 46400 1727204578.69951: dumping result to json 46400 1727204578.69954: done dumping result, returning 46400 1727204578.69959: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-000000001667] 46400 1727204578.69966: sending task result for task 0affcd87-79f5-1303-fda8-000000001667 46400 1727204578.70066: done sending task result for task 0affcd87-79f5-1303-fda8-000000001667 46400 1727204578.70070: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204578.70122: no more pending results, returning what we have 46400 1727204578.70129: results queue empty 46400 1727204578.70133: checking for any_errors_fatal 46400 1727204578.70146: done checking for any_errors_fatal 46400 1727204578.70147: checking for max_fail_percentage 46400 1727204578.70149: done checking for max_fail_percentage 46400 1727204578.70150: checking to see if all hosts have failed and the running result is not ok 46400 1727204578.70151: done checking to see if all hosts have failed 46400 1727204578.70151: getting the remaining hosts for this loop 46400 1727204578.70153: done getting the remaining hosts for this loop 46400 1727204578.70158: getting the next task for host managed-node2 46400 1727204578.70171: done getting next task for host managed-node2 46400 1727204578.70173: ^ task is: TASK: Get NM profile info 46400 1727204578.70180: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204578.70185: getting variables 46400 1727204578.70187: in VariableManager get_vars() 46400 1727204578.70224: Calling all_inventory to load vars for managed-node2 46400 1727204578.70227: Calling groups_inventory to load vars for managed-node2 46400 1727204578.70230: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204578.70241: Calling all_plugins_play to load vars for managed-node2 46400 1727204578.70244: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204578.70246: Calling groups_plugins_play to load vars for managed-node2 46400 1727204578.74647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204578.77039: done with get_vars() 46400 1727204578.77078: done getting variables 46400 1727204578.77175: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:02:58 -0400 (0:00:00.092) 0:01:09.056 ***** 46400 1727204578.77217: entering _queue_task() for managed-node2/shell 46400 1727204578.79099: worker is 1 (out of 1 available) 46400 1727204578.79112: exiting _queue_task() for managed-node2/shell 46400 1727204578.79125: done queuing things up, now waiting for results queue to drain 46400 1727204578.79126: waiting for pending results... 
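The skip recorded above (false_condition: profile_stat.stat.exists) corresponds to a guarded set_fact at get_profile_stat.yml:17. A minimal sketch follows; the fact name is hypothetical, since the log shows only the guard, not what the task would set.

  - name: Set NM profile exist flag based on the profile files
    ansible.builtin.set_fact:
      profile_file_exists: true   # hypothetical fact name; not visible in the log
    when: profile_stat.stat.exists
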
46400 1727204578.80112: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204578.80349: in run() - task 0affcd87-79f5-1303-fda8-000000001668 46400 1727204578.80482: variable 'ansible_search_path' from source: unknown 46400 1727204578.80487: variable 'ansible_search_path' from source: unknown 46400 1727204578.80521: calling self._execute() 46400 1727204578.80731: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.80736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.80747: variable 'omit' from source: magic vars 46400 1727204578.81696: variable 'ansible_distribution_major_version' from source: facts 46400 1727204578.81708: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204578.81715: variable 'omit' from source: magic vars 46400 1727204578.81905: variable 'omit' from source: magic vars 46400 1727204578.82121: variable 'profile' from source: play vars 46400 1727204578.82125: variable 'interface' from source: play vars 46400 1727204578.82193: variable 'interface' from source: play vars 46400 1727204578.82326: variable 'omit' from source: magic vars 46400 1727204578.82377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204578.82411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204578.82489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204578.82505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.82517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204578.82592: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204578.82595: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.82600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.82822: Set connection var ansible_shell_type to sh 46400 1727204578.82827: Set connection var ansible_shell_executable to /bin/sh 46400 1727204578.82831: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204578.82843: Set connection var ansible_connection to ssh 46400 1727204578.82846: Set connection var ansible_pipelining to False 46400 1727204578.82849: Set connection var ansible_timeout to 10 46400 1727204578.83482: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.83490: variable 'ansible_connection' from source: unknown 46400 1727204578.83497: variable 'ansible_module_compression' from source: unknown 46400 1727204578.83503: variable 'ansible_shell_type' from source: unknown 46400 1727204578.83509: variable 'ansible_shell_executable' from source: unknown 46400 1727204578.83523: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204578.83531: variable 'ansible_pipelining' from source: unknown 46400 1727204578.83538: variable 'ansible_timeout' from source: unknown 46400 1727204578.83545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204578.83697: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204578.83716: variable 'omit' from source: magic vars 46400 1727204578.83726: starting attempt loop 46400 1727204578.83739: running the handler 46400 1727204578.83754: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204578.83782: _low_level_execute_command(): starting 46400 1727204578.83795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204578.84605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.84653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.84674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.84694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.84747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.84783: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.84840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.84871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.85039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.85203: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.85219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.85234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.85253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.85274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.85288: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.85303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.85390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.85431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.85451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.85533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.87172: stdout chunk (state=3): >>>/root <<< 46400 1727204578.87374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.87378: stdout chunk (state=3): >>><<< 46400 1727204578.87381: stderr chunk (state=3): >>><<< 46400 1727204578.87472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.87485: _low_level_execute_command(): starting 46400 1727204578.87490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826 `" && echo ansible-tmp-1727204578.8740623-51160-245512507607826="` echo /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826 `" ) && sleep 0' 46400 1727204578.88114: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.88118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.88157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204578.88161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.88167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.88238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.88241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.88243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.88300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.90169: stdout chunk (state=3): >>>ansible-tmp-1727204578.8740623-51160-245512507607826=/root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826 <<< 46400 1727204578.90285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.90365: stderr chunk (state=3): >>><<< 46400 1727204578.90369: stdout chunk (state=3): >>><<< 46400 1727204578.90616: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204578.8740623-51160-245512507607826=/root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204578.90620: variable 'ansible_module_compression' from source: unknown 46400 1727204578.90622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204578.90624: variable 'ansible_facts' from source: unknown 46400 1727204578.90667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/AnsiballZ_command.py 46400 1727204578.91378: Sending initial data 46400 1727204578.91381: Sent initial data (156 bytes) 46400 1727204578.93194: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.93211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.93226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.93246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.93408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.93421: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.93435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.93453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.93467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.93481: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.93500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.93515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.93531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.93543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.93554: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.93573: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.93657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.93731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.93746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.93945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204578.95619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204578.95655: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204578.95703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp_u5_umg8 /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/AnsiballZ_command.py <<< 46400 1727204578.95741: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204578.97160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204578.97286: stderr chunk (state=3): >>><<< 46400 1727204578.97290: stdout chunk (state=3): >>><<< 46400 1727204578.97296: done transferring module to remote 46400 1727204578.97298: _low_level_execute_command(): starting 46400 1727204578.97301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/ /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/AnsiballZ_command.py && sleep 0' 46400 1727204578.98752: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204578.98822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.98836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.98853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.98898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.98939: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204578.98952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.98970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204578.99000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204578.99009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204578.99022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204578.99037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204578.99110: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204578.99123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204578.99137: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204578.99156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204578.99233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204578.99305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204578.99321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204578.99479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204579.01181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204579.01224: stderr chunk (state=3): >>><<< 46400 1727204579.01227: stdout chunk (state=3): >>><<< 46400 1727204579.01316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204579.01319: _low_level_execute_command(): starting 46400 1727204579.01321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/AnsiballZ_command.py && sleep 0' 46400 1727204579.02657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.02661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204579.02699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.02703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.02705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.02766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204579.03007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204579.03021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204579.03110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204579.18194: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:59.162473", "end": "2024-09-24 15:02:59.180890", "delta": "0:00:00.018417", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204579.19388: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. <<< 46400 1727204579.19452: stderr chunk (state=3): >>><<< 46400 1727204579.19456: stdout chunk (state=3): >>><<< 46400 1727204579.19484: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:02:59.162473", "end": "2024-09-24 15:02:59.180890", "delta": "0:00:00.018417", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
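The failed nmcli lookup above is a shell task (get_profile_stat.yml:25). A sketch consistent with the logged arguments and with the 'nm_profile_exists' variable and "...ignoring" marker that appear further down; register name and ignore_errors are inferred from those later log entries, and the grep target is presumably templated from the profile variable rather than hard-coded.

  - name: Get NM profile info
    ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc
    register: nm_profile_exists   # inferred from the later variable lookup
    ignore_errors: true           # inferred from the "...ignoring" in the task result
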
46400 1727204579.19527: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204579.19534: _low_level_execute_command(): starting 46400 1727204579.19540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204578.8740623-51160-245512507607826/ > /dev/null 2>&1 && sleep 0' 46400 1727204579.20933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204579.20945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204579.20960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.20980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204579.21018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204579.21025: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204579.21034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.21052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204579.21060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204579.21074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204579.21082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204579.21091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.21103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204579.21110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204579.21116: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204579.21125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.21207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204579.21226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204579.21237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204579.21308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204579.23104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204579.23258: stderr chunk (state=3): >>><<< 46400 1727204579.23262: stdout chunk (state=3): >>><<< 46400 1727204579.23291: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204579.23304: handler run complete 46400 1727204579.23355: Evaluated conditional (False): False 46400 1727204579.23389: attempt loop complete, returning result 46400 1727204579.23393: _execute() done 46400 1727204579.23399: dumping result to json 46400 1727204579.23412: done dumping result, returning 46400 1727204579.23434: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-000000001668] 46400 1727204579.23441: sending task result for task 0affcd87-79f5-1303-fda8-000000001668 46400 1727204579.23595: done sending task result for task 0affcd87-79f5-1303-fda8-000000001668 46400 1727204579.23599: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018417", "end": "2024-09-24 15:02:59.180890", "rc": 1, "start": "2024-09-24 15:02:59.162473" } MSG: non-zero return code ...ignoring 46400 1727204579.23666: no more pending results, returning what we have 46400 1727204579.23670: results queue empty 46400 1727204579.23671: checking for any_errors_fatal 46400 1727204579.23679: done checking for any_errors_fatal 46400 1727204579.23679: checking for max_fail_percentage 46400 1727204579.23682: done checking for max_fail_percentage 46400 1727204579.23682: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.23683: done checking to see if all hosts have failed 46400 1727204579.23684: getting the remaining hosts for this loop 46400 1727204579.23686: done getting the remaining hosts for this loop 46400 1727204579.23689: getting the next task for host managed-node2 46400 1727204579.23697: done getting next task for host managed-node2 46400 1727204579.23700: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204579.23705: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.23708: getting variables 46400 1727204579.23710: in VariableManager get_vars() 46400 1727204579.23748: Calling all_inventory to load vars for managed-node2 46400 1727204579.23750: Calling groups_inventory to load vars for managed-node2 46400 1727204579.23754: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.23774: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.23778: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.23782: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.27346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.29202: done with get_vars() 46400 1727204579.29243: done getting variables 46400 1727204579.29339: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.521) 0:01:09.578 ***** 46400 1727204579.29419: entering _queue_task() for managed-node2/set_fact 46400 1727204579.29870: worker is 1 (out of 1 available) 46400 1727204579.29882: exiting _queue_task() for managed-node2/set_fact 46400 1727204579.29906: done queuing things up, now waiting for results queue to drain 46400 1727204579.29908: waiting for pending results... 
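The failed-but-ignored command above is the NM profile lookup from get_profile_stat.yml: rc=1 simply means grep found no 'statebr' profile under /etc, which is the expected state for this test. A minimal sketch of how that task is likely written, pieced together from the command string, the nm_profile_exists variable checked by the next task, and the '...ignoring' marker; the real YAML in the fedora.linux_system_roles collection may differ:

# Reconstructed sketch, not the verbatim task from get_profile_stat.yml.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc   # profile expands to statebr here
  register: nm_profile_exists
  ignore_errors: true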
46400 1727204579.30278: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204579.30476: in run() - task 0affcd87-79f5-1303-fda8-000000001669 46400 1727204579.30505: variable 'ansible_search_path' from source: unknown 46400 1727204579.30517: variable 'ansible_search_path' from source: unknown 46400 1727204579.30572: calling self._execute() 46400 1727204579.30709: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.30722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.30737: variable 'omit' from source: magic vars 46400 1727204579.31495: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.31521: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.31697: variable 'nm_profile_exists' from source: set_fact 46400 1727204579.31726: Evaluated conditional (nm_profile_exists.rc == 0): False 46400 1727204579.31739: when evaluation is False, skipping this task 46400 1727204579.31749: _execute() done 46400 1727204579.31758: dumping result to json 46400 1727204579.31773: done dumping result, returning 46400 1727204579.31784: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-000000001669] 46400 1727204579.31799: sending task result for task 0affcd87-79f5-1303-fda8-000000001669 skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 46400 1727204579.31984: no more pending results, returning what we have 46400 1727204579.31989: results queue empty 46400 1727204579.31990: checking for any_errors_fatal 46400 1727204579.32003: done checking for any_errors_fatal 46400 1727204579.32004: checking for max_fail_percentage 46400 1727204579.32006: done checking for max_fail_percentage 46400 1727204579.32007: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.32008: done checking to see if all hosts have failed 46400 1727204579.32009: getting the remaining hosts for this loop 46400 1727204579.32011: done getting the remaining hosts for this loop 46400 1727204579.32016: getting the next task for host managed-node2 46400 1727204579.32028: done getting next task for host managed-node2 46400 1727204579.32031: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204579.32040: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.32046: getting variables 46400 1727204579.32048: in VariableManager get_vars() 46400 1727204579.32099: Calling all_inventory to load vars for managed-node2 46400 1727204579.32103: Calling groups_inventory to load vars for managed-node2 46400 1727204579.32108: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.32123: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.32126: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.32129: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.33201: done sending task result for task 0affcd87-79f5-1303-fda8-000000001669 46400 1727204579.33206: WORKER PROCESS EXITING 46400 1727204579.34450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.37660: done with get_vars() 46400 1727204579.37701: done getting variables 46400 1727204579.37770: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.38046: variable 'profile' from source: play vars 46400 1727204579.38050: variable 'interface' from source: play vars 46400 1727204579.38117: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.087) 0:01:09.669 ***** 46400 1727204579.38504: entering _queue_task() for managed-node2/command 46400 1727204579.40074: worker is 1 (out of 1 available) 46400 1727204579.40089: exiting _queue_task() for managed-node2/command 46400 1727204579.40100: done queuing things up, now waiting for results queue to drain 46400 1727204579.40102: waiting for pending results... 
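The skip recorded above is the flag-setting step at get_profile_stat.yml:35: it only fires when the nmcli lookup succeeded (nm_profile_exists.rc == 0). Because rc was 1, lsr_net_profile_exists keeps its earlier false value, which is exactly what the absent-profile assertion later in this block relies on. A plausible sketch; lsr_net_profile_exists is the name asserted later in this run, while the second flag's name is an assumption:

# Sketch only; the ansible_managed flag name below is assumed, not shown in the log.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
  when: nm_profile_exists.rc == 0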
46400 1727204579.40665: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204579.40872: in run() - task 0affcd87-79f5-1303-fda8-00000000166b 46400 1727204579.40892: variable 'ansible_search_path' from source: unknown 46400 1727204579.40896: variable 'ansible_search_path' from source: unknown 46400 1727204579.40936: calling self._execute() 46400 1727204579.41133: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.41142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.41153: variable 'omit' from source: magic vars 46400 1727204579.41818: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.41830: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.42095: variable 'profile_stat' from source: set_fact 46400 1727204579.42099: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204579.42104: when evaluation is False, skipping this task 46400 1727204579.42107: _execute() done 46400 1727204579.42109: dumping result to json 46400 1727204579.42111: done dumping result, returning 46400 1727204579.42114: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000166b] 46400 1727204579.42119: sending task result for task 0affcd87-79f5-1303-fda8-00000000166b 46400 1727204579.42234: done sending task result for task 0affcd87-79f5-1303-fda8-00000000166b 46400 1727204579.42239: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204579.42296: no more pending results, returning what we have 46400 1727204579.42301: results queue empty 46400 1727204579.42302: checking for any_errors_fatal 46400 1727204579.42312: done checking for any_errors_fatal 46400 1727204579.42313: checking for max_fail_percentage 46400 1727204579.42314: done checking for max_fail_percentage 46400 1727204579.42315: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.42316: done checking to see if all hosts have failed 46400 1727204579.42317: getting the remaining hosts for this loop 46400 1727204579.42319: done getting the remaining hosts for this loop 46400 1727204579.42324: getting the next task for host managed-node2 46400 1727204579.42332: done getting next task for host managed-node2 46400 1727204579.42334: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204579.42340: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.42345: getting variables 46400 1727204579.42346: in VariableManager get_vars() 46400 1727204579.42389: Calling all_inventory to load vars for managed-node2 46400 1727204579.42392: Calling groups_inventory to load vars for managed-node2 46400 1727204579.42396: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.42410: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.42412: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.42414: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.44972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.46957: done with get_vars() 46400 1727204579.47014: done getting variables 46400 1727204579.47134: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.47299: variable 'profile' from source: play vars 46400 1727204579.47303: variable 'interface' from source: play vars 46400 1727204579.47368: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.089) 0:01:09.758 ***** 46400 1727204579.47408: entering _queue_task() for managed-node2/set_fact 46400 1727204579.47763: worker is 1 (out of 1 available) 46400 1727204579.47778: exiting _queue_task() for managed-node2/set_fact 46400 1727204579.47791: done queuing things up, now waiting for results queue to drain 46400 1727204579.47792: waiting for pending results... 
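This task and the next three (get/verify the ansible_managed comment, then get/verify the fingerprint comment, get_profile_stat.yml:49-69) are all gated on profile_stat.stat.exists, presumably the stat result for the profile's ifcfg file registered earlier in get_profile_stat.yml; since no ifcfg-statebr file exists, all four are skipped, as the following entries show. The 'Get' half of each pair is a command task roughly like this sketch (the ifcfg path is assumed; it is not printed in this log):

# Sketch of the comment-lookup pattern; path and exact wording assumed.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists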
46400 1727204579.48219: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204579.48355: in run() - task 0affcd87-79f5-1303-fda8-00000000166c 46400 1727204579.48392: variable 'ansible_search_path' from source: unknown 46400 1727204579.48396: variable 'ansible_search_path' from source: unknown 46400 1727204579.48476: calling self._execute() 46400 1727204579.48640: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.48646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.48656: variable 'omit' from source: magic vars 46400 1727204579.49018: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.49028: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.49118: variable 'profile_stat' from source: set_fact 46400 1727204579.49128: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204579.49131: when evaluation is False, skipping this task 46400 1727204579.49135: _execute() done 46400 1727204579.49138: dumping result to json 46400 1727204579.49140: done dumping result, returning 46400 1727204579.49146: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000166c] 46400 1727204579.49152: sending task result for task 0affcd87-79f5-1303-fda8-00000000166c 46400 1727204579.49246: done sending task result for task 0affcd87-79f5-1303-fda8-00000000166c 46400 1727204579.49249: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204579.49298: no more pending results, returning what we have 46400 1727204579.49302: results queue empty 46400 1727204579.49303: checking for any_errors_fatal 46400 1727204579.49311: done checking for any_errors_fatal 46400 1727204579.49312: checking for max_fail_percentage 46400 1727204579.49314: done checking for max_fail_percentage 46400 1727204579.49314: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.49315: done checking to see if all hosts have failed 46400 1727204579.49316: getting the remaining hosts for this loop 46400 1727204579.49318: done getting the remaining hosts for this loop 46400 1727204579.49328: getting the next task for host managed-node2 46400 1727204579.49337: done getting next task for host managed-node2 46400 1727204579.49340: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204579.49345: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.49350: getting variables 46400 1727204579.49351: in VariableManager get_vars() 46400 1727204579.49395: Calling all_inventory to load vars for managed-node2 46400 1727204579.49398: Calling groups_inventory to load vars for managed-node2 46400 1727204579.49401: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.49413: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.49415: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.49418: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.50410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.52377: done with get_vars() 46400 1727204579.52411: done getting variables 46400 1727204579.52475: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.52659: variable 'profile' from source: play vars 46400 1727204579.52665: variable 'interface' from source: play vars 46400 1727204579.52769: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.053) 0:01:09.812 ***** 46400 1727204579.52805: entering _queue_task() for managed-node2/command 46400 1727204579.53075: worker is 1 (out of 1 available) 46400 1727204579.53088: exiting _queue_task() for managed-node2/command 46400 1727204579.53101: done queuing things up, now waiting for results queue to drain 46400 1727204579.53102: waiting for pending results... 
46400 1727204579.53284: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204579.53369: in run() - task 0affcd87-79f5-1303-fda8-00000000166d 46400 1727204579.53381: variable 'ansible_search_path' from source: unknown 46400 1727204579.53384: variable 'ansible_search_path' from source: unknown 46400 1727204579.53413: calling self._execute() 46400 1727204579.53488: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.53494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.53502: variable 'omit' from source: magic vars 46400 1727204579.53768: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.53778: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.53865: variable 'profile_stat' from source: set_fact 46400 1727204579.53873: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204579.53876: when evaluation is False, skipping this task 46400 1727204579.53878: _execute() done 46400 1727204579.53881: dumping result to json 46400 1727204579.53883: done dumping result, returning 46400 1727204579.53889: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000166d] 46400 1727204579.53895: sending task result for task 0affcd87-79f5-1303-fda8-00000000166d 46400 1727204579.53989: done sending task result for task 0affcd87-79f5-1303-fda8-00000000166d 46400 1727204579.53992: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204579.54040: no more pending results, returning what we have 46400 1727204579.54045: results queue empty 46400 1727204579.54046: checking for any_errors_fatal 46400 1727204579.54053: done checking for any_errors_fatal 46400 1727204579.54053: checking for max_fail_percentage 46400 1727204579.54055: done checking for max_fail_percentage 46400 1727204579.54056: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.54057: done checking to see if all hosts have failed 46400 1727204579.54057: getting the remaining hosts for this loop 46400 1727204579.54059: done getting the remaining hosts for this loop 46400 1727204579.54066: getting the next task for host managed-node2 46400 1727204579.54078: done getting next task for host managed-node2 46400 1727204579.54081: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204579.54086: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.54092: getting variables 46400 1727204579.54093: in VariableManager get_vars() 46400 1727204579.54128: Calling all_inventory to load vars for managed-node2 46400 1727204579.54131: Calling groups_inventory to load vars for managed-node2 46400 1727204579.54134: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.54145: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.54147: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.54149: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.55599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.57162: done with get_vars() 46400 1727204579.57188: done getting variables 46400 1727204579.57234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.57328: variable 'profile' from source: play vars 46400 1727204579.57331: variable 'interface' from source: play vars 46400 1727204579.57374: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.045) 0:01:09.858 ***** 46400 1727204579.57401: entering _queue_task() for managed-node2/set_fact 46400 1727204579.57739: worker is 1 (out of 1 available) 46400 1727204579.57750: exiting _queue_task() for managed-node2/set_fact 46400 1727204579.57767: done queuing things up, now waiting for results queue to drain 46400 1727204579.57769: waiting for pending results... 
46400 1727204579.58046: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204579.58180: in run() - task 0affcd87-79f5-1303-fda8-00000000166e 46400 1727204579.58198: variable 'ansible_search_path' from source: unknown 46400 1727204579.58207: variable 'ansible_search_path' from source: unknown 46400 1727204579.58243: calling self._execute() 46400 1727204579.58341: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.58351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.58368: variable 'omit' from source: magic vars 46400 1727204579.58732: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.58752: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.58891: variable 'profile_stat' from source: set_fact 46400 1727204579.58908: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204579.58916: when evaluation is False, skipping this task 46400 1727204579.58922: _execute() done 46400 1727204579.58929: dumping result to json 46400 1727204579.58934: done dumping result, returning 46400 1727204579.58942: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000166e] 46400 1727204579.58950: sending task result for task 0affcd87-79f5-1303-fda8-00000000166e 46400 1727204579.59055: done sending task result for task 0affcd87-79f5-1303-fda8-00000000166e 46400 1727204579.59066: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204579.59109: no more pending results, returning what we have 46400 1727204579.59113: results queue empty 46400 1727204579.59114: checking for any_errors_fatal 46400 1727204579.59122: done checking for any_errors_fatal 46400 1727204579.59123: checking for max_fail_percentage 46400 1727204579.59124: done checking for max_fail_percentage 46400 1727204579.59125: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.59126: done checking to see if all hosts have failed 46400 1727204579.59127: getting the remaining hosts for this loop 46400 1727204579.59128: done getting the remaining hosts for this loop 46400 1727204579.59132: getting the next task for host managed-node2 46400 1727204579.59143: done getting next task for host managed-node2 46400 1727204579.59147: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 46400 1727204579.59152: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204579.59157: getting variables 46400 1727204579.59159: in VariableManager get_vars() 46400 1727204579.59202: Calling all_inventory to load vars for managed-node2 46400 1727204579.59205: Calling groups_inventory to load vars for managed-node2 46400 1727204579.59209: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.59223: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.59225: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.59227: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.60839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.61759: done with get_vars() 46400 1727204579.61783: done getting variables 46400 1727204579.61830: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.61921: variable 'profile' from source: play vars 46400 1727204579.61924: variable 'interface' from source: play vars 46400 1727204579.61967: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.045) 0:01:09.904 ***** 46400 1727204579.61994: entering _queue_task() for managed-node2/assert 46400 1727204579.62238: worker is 1 (out of 1 available) 46400 1727204579.62252: exiting _queue_task() for managed-node2/assert 46400 1727204579.62269: done queuing things up, now waiting for results queue to drain 46400 1727204579.62271: waiting for pending results... 
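The task queued above (assert_profile_absent.yml:5) is the check this whole block builds toward: with the nmcli lookup having found nothing, lsr_net_profile_exists must be false. The condition its handler evaluates, 'not lsr_net_profile_exists', appears a few entries further down, followed by 'All assertions passed'. A minimal sketch; the real task may carry a failure message or extra assertions:

# Sketch; any additional assertions or messages in the real task are not shown in the log.
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists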
46400 1727204579.62471: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 46400 1727204579.62604: in run() - task 0affcd87-79f5-1303-fda8-0000000015d5 46400 1727204579.62619: variable 'ansible_search_path' from source: unknown 46400 1727204579.62623: variable 'ansible_search_path' from source: unknown 46400 1727204579.62669: calling self._execute() 46400 1727204579.62780: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.62784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.62799: variable 'omit' from source: magic vars 46400 1727204579.63250: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.63260: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.63270: variable 'omit' from source: magic vars 46400 1727204579.63307: variable 'omit' from source: magic vars 46400 1727204579.63463: variable 'profile' from source: play vars 46400 1727204579.63468: variable 'interface' from source: play vars 46400 1727204579.63520: variable 'interface' from source: play vars 46400 1727204579.63532: variable 'omit' from source: magic vars 46400 1727204579.63586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204579.63619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204579.63637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204579.63651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.63663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.63687: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204579.63690: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.63693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.63763: Set connection var ansible_shell_type to sh 46400 1727204579.63771: Set connection var ansible_shell_executable to /bin/sh 46400 1727204579.63776: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204579.63793: Set connection var ansible_connection to ssh 46400 1727204579.63799: Set connection var ansible_pipelining to False 46400 1727204579.63807: Set connection var ansible_timeout to 10 46400 1727204579.63839: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.63842: variable 'ansible_connection' from source: unknown 46400 1727204579.63845: variable 'ansible_module_compression' from source: unknown 46400 1727204579.63847: variable 'ansible_shell_type' from source: unknown 46400 1727204579.63858: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.63870: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.63874: variable 'ansible_pipelining' from source: unknown 46400 1727204579.63877: variable 'ansible_timeout' from source: unknown 46400 1727204579.63880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.64035: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204579.64049: variable 'omit' from source: magic vars 46400 1727204579.64070: starting attempt loop 46400 1727204579.64082: running the handler 46400 1727204579.64206: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204579.64218: Evaluated conditional (not lsr_net_profile_exists): True 46400 1727204579.64221: handler run complete 46400 1727204579.64247: attempt loop complete, returning result 46400 1727204579.64250: _execute() done 46400 1727204579.64253: dumping result to json 46400 1727204579.64255: done dumping result, returning 46400 1727204579.64258: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [0affcd87-79f5-1303-fda8-0000000015d5] 46400 1727204579.64270: sending task result for task 0affcd87-79f5-1303-fda8-0000000015d5 46400 1727204579.64417: done sending task result for task 0affcd87-79f5-1303-fda8-0000000015d5 46400 1727204579.64420: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204579.64599: no more pending results, returning what we have 46400 1727204579.64606: results queue empty 46400 1727204579.64607: checking for any_errors_fatal 46400 1727204579.64682: done checking for any_errors_fatal 46400 1727204579.64684: checking for max_fail_percentage 46400 1727204579.64686: done checking for max_fail_percentage 46400 1727204579.64687: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.64688: done checking to see if all hosts have failed 46400 1727204579.64689: getting the remaining hosts for this loop 46400 1727204579.64693: done getting the remaining hosts for this loop 46400 1727204579.64697: getting the next task for host managed-node2 46400 1727204579.64708: done getting next task for host managed-node2 46400 1727204579.64712: ^ task is: TASK: Conditional asserts 46400 1727204579.64715: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204579.64722: getting variables 46400 1727204579.64724: in VariableManager get_vars() 46400 1727204579.64772: Calling all_inventory to load vars for managed-node2 46400 1727204579.64775: Calling groups_inventory to load vars for managed-node2 46400 1727204579.64779: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.64795: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.64799: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.64803: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.66348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.67457: done with get_vars() 46400 1727204579.67482: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.055) 0:01:09.960 ***** 46400 1727204579.67559: entering _queue_task() for managed-node2/include_tasks 46400 1727204579.67817: worker is 1 (out of 1 available) 46400 1727204579.67830: exiting _queue_task() for managed-node2/include_tasks 46400 1727204579.67843: done queuing things up, now waiting for results queue to drain 46400 1727204579.67845: waiting for pending results... 46400 1727204579.68033: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204579.68115: in run() - task 0affcd87-79f5-1303-fda8-00000000100b 46400 1727204579.68126: variable 'ansible_search_path' from source: unknown 46400 1727204579.68129: variable 'ansible_search_path' from source: unknown 46400 1727204579.68370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204579.70451: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204579.70524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204579.70554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204579.70608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204579.70633: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204579.70721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204579.70767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204579.70793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204579.70828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204579.70847: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204579.70978: dumping result to json 46400 1727204579.70982: done dumping result, returning 46400 1727204579.71001: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-00000000100b] 46400 1727204579.71004: sending task result for task 0affcd87-79f5-1303-fda8-00000000100b 46400 1727204579.71159: done sending task result for task 0affcd87-79f5-1303-fda8-00000000100b 46400 1727204579.71178: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 46400 1727204579.71279: no more pending results, returning what we have 46400 1727204579.71283: results queue empty 46400 1727204579.71283: checking for any_errors_fatal 46400 1727204579.71289: done checking for any_errors_fatal 46400 1727204579.71289: checking for max_fail_percentage 46400 1727204579.71291: done checking for max_fail_percentage 46400 1727204579.71291: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.71292: done checking to see if all hosts have failed 46400 1727204579.71293: getting the remaining hosts for this loop 46400 1727204579.71294: done getting the remaining hosts for this loop 46400 1727204579.71298: getting the next task for host managed-node2 46400 1727204579.71305: done getting next task for host managed-node2 46400 1727204579.71307: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204579.71309: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204579.71312: getting variables 46400 1727204579.71314: in VariableManager get_vars() 46400 1727204579.71352: Calling all_inventory to load vars for managed-node2 46400 1727204579.71354: Calling groups_inventory to load vars for managed-node2 46400 1727204579.71357: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.71370: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.71373: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.71375: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.72405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.73634: done with get_vars() 46400 1727204579.73652: done getting variables 46400 1727204579.73701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204579.73799: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.062) 0:01:10.022 ***** 46400 1727204579.73827: entering _queue_task() for managed-node2/debug 46400 1727204579.74072: worker is 1 (out of 1 available) 46400 1727204579.74087: exiting _queue_task() for managed-node2/debug 46400 1727204579.74100: done queuing things up, now waiting for results queue to drain 46400 1727204579.74101: waiting for pending results... 
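'Conditional asserts' (run_test.yml:42) is skipped outright because its loop list is empty ('No items in the list'), so the run goes straight to the success banner at run_test.yml:47. Judging by the message printed below, that banner is a plain debug task along these lines:

# Sketch; the message format is taken from the output shown below.
- name: Success in test '{{ lsr_description }}'
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"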
46400 1727204579.74326: running TaskExecutor() for managed-node2/TASK: Success in test 'I can remove an existing profile without taking it down' 46400 1727204579.74423: in run() - task 0affcd87-79f5-1303-fda8-00000000100c 46400 1727204579.74446: variable 'ansible_search_path' from source: unknown 46400 1727204579.74450: variable 'ansible_search_path' from source: unknown 46400 1727204579.74471: calling self._execute() 46400 1727204579.74586: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.74589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.74611: variable 'omit' from source: magic vars 46400 1727204579.74905: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.74917: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.74922: variable 'omit' from source: magic vars 46400 1727204579.74952: variable 'omit' from source: magic vars 46400 1727204579.75054: variable 'lsr_description' from source: include params 46400 1727204579.75075: variable 'omit' from source: magic vars 46400 1727204579.75125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204579.75153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204579.75184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204579.75198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.75207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.75249: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204579.75252: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.75255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.75383: Set connection var ansible_shell_type to sh 46400 1727204579.75387: Set connection var ansible_shell_executable to /bin/sh 46400 1727204579.75389: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204579.75393: Set connection var ansible_connection to ssh 46400 1727204579.75395: Set connection var ansible_pipelining to False 46400 1727204579.75412: Set connection var ansible_timeout to 10 46400 1727204579.75440: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.75443: variable 'ansible_connection' from source: unknown 46400 1727204579.75446: variable 'ansible_module_compression' from source: unknown 46400 1727204579.75448: variable 'ansible_shell_type' from source: unknown 46400 1727204579.75450: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.75452: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.75455: variable 'ansible_pipelining' from source: unknown 46400 1727204579.75457: variable 'ansible_timeout' from source: unknown 46400 1727204579.75459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.75583: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204579.75609: variable 'omit' from source: magic vars 46400 1727204579.75612: starting attempt loop 46400 1727204579.75615: running the handler 46400 1727204579.75647: handler run complete 46400 1727204579.75658: attempt loop complete, returning result 46400 1727204579.75661: _execute() done 46400 1727204579.75677: dumping result to json 46400 1727204579.75680: done dumping result, returning 46400 1727204579.75691: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can remove an existing profile without taking it down' [0affcd87-79f5-1303-fda8-00000000100c] 46400 1727204579.75693: sending task result for task 0affcd87-79f5-1303-fda8-00000000100c 46400 1727204579.75817: done sending task result for task 0affcd87-79f5-1303-fda8-00000000100c 46400 1727204579.75819: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 46400 1727204579.75905: no more pending results, returning what we have 46400 1727204579.75909: results queue empty 46400 1727204579.75910: checking for any_errors_fatal 46400 1727204579.75916: done checking for any_errors_fatal 46400 1727204579.75917: checking for max_fail_percentage 46400 1727204579.75919: done checking for max_fail_percentage 46400 1727204579.75919: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.75920: done checking to see if all hosts have failed 46400 1727204579.75921: getting the remaining hosts for this loop 46400 1727204579.75945: done getting the remaining hosts for this loop 46400 1727204579.75970: getting the next task for host managed-node2 46400 1727204579.75978: done getting next task for host managed-node2 46400 1727204579.75981: ^ task is: TASK: Cleanup 46400 1727204579.75983: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204579.76019: getting variables 46400 1727204579.76043: in VariableManager get_vars() 46400 1727204579.76103: Calling all_inventory to load vars for managed-node2 46400 1727204579.76107: Calling groups_inventory to load vars for managed-node2 46400 1727204579.76111: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.76119: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.76121: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.76124: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.77074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.78223: done with get_vars() 46400 1727204579.78241: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.045) 0:01:10.067 ***** 46400 1727204579.78342: entering _queue_task() for managed-node2/include_tasks 46400 1727204579.78840: worker is 1 (out of 1 available) 46400 1727204579.78859: exiting _queue_task() for managed-node2/include_tasks 46400 1727204579.78889: done queuing things up, now waiting for results queue to drain 46400 1727204579.78891: waiting for pending results... 46400 1727204579.79217: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204579.79395: in run() - task 0affcd87-79f5-1303-fda8-000000001010 46400 1727204579.79399: variable 'ansible_search_path' from source: unknown 46400 1727204579.79402: variable 'ansible_search_path' from source: unknown 46400 1727204579.79418: variable 'lsr_cleanup' from source: include params 46400 1727204579.79672: variable 'lsr_cleanup' from source: include params 46400 1727204579.79969: variable 'omit' from source: magic vars 46400 1727204579.79973: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.79977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.79979: variable 'omit' from source: magic vars 46400 1727204579.80090: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.80105: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.80111: variable 'item' from source: unknown 46400 1727204579.80179: variable 'item' from source: unknown 46400 1727204579.80214: variable 'item' from source: unknown 46400 1727204579.80304: variable 'item' from source: unknown 46400 1727204579.80417: dumping result to json 46400 1727204579.80421: done dumping result, returning 46400 1727204579.80424: done running TaskExecutor() for managed-node2/TASK: Cleanup [0affcd87-79f5-1303-fda8-000000001010] 46400 1727204579.80427: sending task result for task 0affcd87-79f5-1303-fda8-000000001010 46400 1727204579.80468: done sending task result for task 0affcd87-79f5-1303-fda8-000000001010 46400 1727204579.80472: WORKER PROCESS EXITING 46400 1727204579.80498: no more pending results, returning what we have 46400 1727204579.80504: in VariableManager get_vars() 46400 1727204579.80554: Calling all_inventory to load vars for managed-node2 46400 1727204579.80557: Calling groups_inventory to load vars for managed-node2 46400 1727204579.80566: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.80581: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204579.80585: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.80589: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.89272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.90501: done with get_vars() 46400 1727204579.90520: variable 'ansible_search_path' from source: unknown 46400 1727204579.90521: variable 'ansible_search_path' from source: unknown 46400 1727204579.90548: we have included files to process 46400 1727204579.90549: generating all_blocks data 46400 1727204579.90550: done generating all_blocks data 46400 1727204579.90552: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204579.90553: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204579.90554: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204579.90683: done processing included file 46400 1727204579.90685: iterating over new_blocks loaded from include file 46400 1727204579.90686: in VariableManager get_vars() 46400 1727204579.90697: done with get_vars() 46400 1727204579.90698: filtering new block on tags 46400 1727204579.90714: done filtering new block on tags 46400 1727204579.90715: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204579.90718: extending task lists for all hosts with included blocks 46400 1727204579.91424: done extending task lists 46400 1727204579.91425: done processing included files 46400 1727204579.91426: results queue empty 46400 1727204579.91426: checking for any_errors_fatal 46400 1727204579.91428: done checking for any_errors_fatal 46400 1727204579.91429: checking for max_fail_percentage 46400 1727204579.91430: done checking for max_fail_percentage 46400 1727204579.91430: checking to see if all hosts have failed and the running result is not ok 46400 1727204579.91431: done checking to see if all hosts have failed 46400 1727204579.91431: getting the remaining hosts for this loop 46400 1727204579.91432: done getting the remaining hosts for this loop 46400 1727204579.91433: getting the next task for host managed-node2 46400 1727204579.91436: done getting next task for host managed-node2 46400 1727204579.91437: ^ task is: TASK: Cleanup profile and device 46400 1727204579.91439: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204579.91441: getting variables 46400 1727204579.91441: in VariableManager get_vars() 46400 1727204579.91450: Calling all_inventory to load vars for managed-node2 46400 1727204579.91451: Calling groups_inventory to load vars for managed-node2 46400 1727204579.91453: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204579.91457: Calling all_plugins_play to load vars for managed-node2 46400 1727204579.91458: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204579.91462: Calling groups_plugins_play to load vars for managed-node2 46400 1727204579.92624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204579.93684: done with get_vars() 46400 1727204579.93705: done getting variables 46400 1727204579.93738: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:02:59 -0400 (0:00:00.154) 0:01:10.222 ***** 46400 1727204579.93762: entering _queue_task() for managed-node2/shell 46400 1727204579.94023: worker is 1 (out of 1 available) 46400 1727204579.94037: exiting _queue_task() for managed-node2/shell 46400 1727204579.94050: done queuing things up, now waiting for results queue to drain 46400 1727204579.94052: waiting for pending results... 
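
For orientation, the shell task being queued here is the profile/device cleanup from cleanup_profile+device.yml; the exact commands it runs appear verbatim in the module_args further down in this log (the profile and interface under test in this run is statebr):

    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr

The non-zero return code recorded below (rc=1, stderr 'Cannot find device "statebr"') most likely comes from the final ip link del, since the device is already gone at this point; the task result is ignored, as the "...ignoring" marker shows.
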
46400 1727204579.94247: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204579.94328: in run() - task 0affcd87-79f5-1303-fda8-0000000016ad 46400 1727204579.94340: variable 'ansible_search_path' from source: unknown 46400 1727204579.94343: variable 'ansible_search_path' from source: unknown 46400 1727204579.94375: calling self._execute() 46400 1727204579.94449: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.94453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.94466: variable 'omit' from source: magic vars 46400 1727204579.94752: variable 'ansible_distribution_major_version' from source: facts 46400 1727204579.94770: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204579.94776: variable 'omit' from source: magic vars 46400 1727204579.94810: variable 'omit' from source: magic vars 46400 1727204579.94986: variable 'interface' from source: play vars 46400 1727204579.95011: variable 'omit' from source: magic vars 46400 1727204579.95059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204579.95110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204579.95134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204579.95157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.95183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204579.95222: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204579.95234: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.95243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.95345: Set connection var ansible_shell_type to sh 46400 1727204579.95358: Set connection var ansible_shell_executable to /bin/sh 46400 1727204579.95371: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204579.95379: Set connection var ansible_connection to ssh 46400 1727204579.95387: Set connection var ansible_pipelining to False 46400 1727204579.95399: Set connection var ansible_timeout to 10 46400 1727204579.95430: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.95437: variable 'ansible_connection' from source: unknown 46400 1727204579.95442: variable 'ansible_module_compression' from source: unknown 46400 1727204579.95447: variable 'ansible_shell_type' from source: unknown 46400 1727204579.95452: variable 'ansible_shell_executable' from source: unknown 46400 1727204579.95457: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204579.95468: variable 'ansible_pipelining' from source: unknown 46400 1727204579.95475: variable 'ansible_timeout' from source: unknown 46400 1727204579.95481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204579.95620: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204579.95640: variable 'omit' from source: magic vars 46400 1727204579.95648: starting attempt loop 46400 1727204579.95654: running the handler 46400 1727204579.95672: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204579.95695: _low_level_execute_command(): starting 46400 1727204579.95707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204579.96938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204579.96952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204579.96971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.96990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204579.97041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204579.97059: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204579.97080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.97096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204579.97106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204579.97115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204579.97124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204579.97139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204579.97152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204579.97169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204579.97178: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204579.97190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204579.97271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204579.97298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204579.97317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204579.97403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204579.99056: stdout chunk (state=3): >>>/root <<< 46400 1727204579.99157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204579.99231: stderr chunk (state=3): >>><<< 46400 1727204579.99235: stdout chunk (state=3): >>><<< 46400 1727204579.99262: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204579.99280: _low_level_execute_command(): starting 46400 1727204579.99287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380 `" && echo ansible-tmp-1727204579.992625-51362-241378675126380="` echo /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380 `" ) && sleep 0' 46400 1727204580.00074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204580.00094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204580.01943: stdout chunk (state=3): >>>ansible-tmp-1727204579.992625-51362-241378675126380=/root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380 <<< 46400 1727204580.02062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204580.02113: stderr chunk (state=3): >>><<< 46400 1727204580.02117: stdout chunk (state=3): >>><<< 46400 1727204580.02133: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204579.992625-51362-241378675126380=/root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204580.02162: variable 'ansible_module_compression' from source: unknown 46400 1727204580.02211: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204580.02241: variable 'ansible_facts' from source: unknown 46400 1727204580.02305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/AnsiballZ_command.py 46400 1727204580.02414: Sending initial data 46400 1727204580.02417: Sent initial data (155 bytes) 46400 1727204580.03086: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.03127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204580.03154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204580.03173: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204580.03188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.03204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204580.03330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204580.03386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204580.05100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204580.05133: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204580.05171: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-46400rspozge8/tmppuzbhoed /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/AnsiballZ_command.py <<< 46400 1727204580.05207: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204580.06052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204580.06269: stderr chunk (state=3): >>><<< 46400 1727204580.06272: stdout chunk (state=3): >>><<< 46400 1727204580.06313: done transferring module to remote 46400 1727204580.06334: _low_level_execute_command(): starting 46400 1727204580.06337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/ /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/AnsiballZ_command.py && sleep 0' 46400 1727204580.07080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.07084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204580.07120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204580.07132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.07246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204580.07286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204580.07310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204580.07376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204580.09092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204580.09185: stderr chunk (state=3): >>><<< 46400 1727204580.09191: stdout chunk (state=3): >>><<< 46400 1727204580.09231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204580.09234: _low_level_execute_command(): starting 46400 1727204580.09237: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/AnsiballZ_command.py && sleep 0' 46400 1727204580.09979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.09988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204580.10025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.10031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204580.10039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.10045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204580.10077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204580.10081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.10139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204580.10147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204580.10158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204580.10224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204580.29988: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (32d7bf17-3bad-4841-bdea-bee9f6832024) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:00.233510", "end": "2024-09-24 15:03:00.298572", "delta": "0:00:00.065062", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204580.31195: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204580.31271: stderr chunk (state=3): >>><<< 46400 1727204580.31275: stdout chunk (state=3): >>><<< 46400 1727204580.31375: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (32d7bf17-3bad-4841-bdea-bee9f6832024) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:00.233510", "end": "2024-09-24 15:03:00.298572", "delta": "0:00:00.065062", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
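
Stripped of the SSH multiplexing chatter, the exchange above follows Ansible's usual non-pipelined module execution path (ansible_pipelining is set to False for this connection earlier in the log). A rough sketch of the remote command sequence, where TMPDIR stands in for the generated /root/.ansible/tmp/ansible-tmp-... directory named in the log:

    /bin/sh -c 'echo ~ && sleep 0'                            # discover the remote home directory
    /bin/sh -c '( umask 77 && mkdir -p TMPDIR ) && sleep 0'   # create the per-task temp directory
    # AnsiballZ_command.py is then transferred into TMPDIR over the existing SFTP channel
    /bin/sh -c 'chmod u+x TMPDIR TMPDIR/AnsiballZ_command.py && sleep 0'
    /bin/sh -c '/usr/bin/python3.9 TMPDIR/AnsiballZ_command.py && sleep 0'
    /bin/sh -c 'rm -f -r TMPDIR > /dev/null 2>&1 && sleep 0'  # cleanup, logged just below once the JSON result is read back
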
46400 1727204580.31379: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204580.31387: _low_level_execute_command(): starting 46400 1727204580.31389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204579.992625-51362-241378675126380/ > /dev/null 2>&1 && sleep 0' 46400 1727204580.32443: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204580.32468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.32488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204580.32511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204580.32614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204580.32631: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204580.32646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.32665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204580.32775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204580.32789: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204580.32803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204580.32818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204580.32834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204580.32849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204580.32862: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204580.32886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204580.32960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204580.32991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204580.33008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204580.33079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204580.34954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204580.35000: stderr chunk (state=3): >>><<< 
46400 1727204580.35004: stdout chunk (state=3): >>><<< 46400 1727204580.35075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204580.35079: handler run complete 46400 1727204580.35273: Evaluated conditional (False): False 46400 1727204580.35277: attempt loop complete, returning result 46400 1727204580.35279: _execute() done 46400 1727204580.35281: dumping result to json 46400 1727204580.35283: done dumping result, returning 46400 1727204580.35285: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-0000000016ad] 46400 1727204580.35287: sending task result for task 0affcd87-79f5-1303-fda8-0000000016ad 46400 1727204580.35362: done sending task result for task 0affcd87-79f5-1303-fda8-0000000016ad 46400 1727204580.35367: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.065062", "end": "2024-09-24 15:03:00.298572", "rc": 1, "start": "2024-09-24 15:03:00.233510" } STDOUT: Connection 'statebr' (32d7bf17-3bad-4841-bdea-bee9f6832024) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204580.35441: no more pending results, returning what we have 46400 1727204580.35445: results queue empty 46400 1727204580.35446: checking for any_errors_fatal 46400 1727204580.35449: done checking for any_errors_fatal 46400 1727204580.35450: checking for max_fail_percentage 46400 1727204580.35451: done checking for max_fail_percentage 46400 1727204580.35453: checking to see if all hosts have failed and the running result is not ok 46400 1727204580.35454: done checking to see if all hosts have failed 46400 1727204580.35454: getting the remaining hosts for this loop 46400 1727204580.35456: done getting the remaining hosts for this loop 46400 1727204580.35460: getting the next task for host managed-node2 46400 1727204580.35478: done getting next task for host managed-node2 46400 1727204580.35482: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204580.35484: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204580.35489: getting variables 46400 1727204580.35490: in VariableManager get_vars() 46400 1727204580.35532: Calling all_inventory to load vars for managed-node2 46400 1727204580.35535: Calling groups_inventory to load vars for managed-node2 46400 1727204580.35539: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.35551: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.35554: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.35559: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.38759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204580.42547: done with get_vars() 46400 1727204580.42587: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Tuesday 24 September 2024 15:03:00 -0400 (0:00:00.490) 0:01:10.712 ***** 46400 1727204580.42803: entering _queue_task() for managed-node2/include_tasks 46400 1727204580.44441: worker is 1 (out of 1 available) 46400 1727204580.44454: exiting _queue_task() for managed-node2/include_tasks 46400 1727204580.44529: done queuing things up, now waiting for results queue to drain 46400 1727204580.44532: waiting for pending results... 46400 1727204580.45526: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204580.45621: in run() - task 0affcd87-79f5-1303-fda8-000000000015 46400 1727204580.45635: variable 'ansible_search_path' from source: unknown 46400 1727204580.45980: calling self._execute() 46400 1727204580.46079: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.46086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.46097: variable 'omit' from source: magic vars 46400 1727204580.46999: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.47011: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.47018: _execute() done 46400 1727204580.47021: dumping result to json 46400 1727204580.47024: done dumping result, returning 46400 1727204580.47030: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-000000000015] 46400 1727204580.47036: sending task result for task 0affcd87-79f5-1303-fda8-000000000015 46400 1727204580.47462: done sending task result for task 0affcd87-79f5-1303-fda8-000000000015 46400 1727204580.47467: WORKER PROCESS EXITING 46400 1727204580.47495: no more pending results, returning what we have 46400 1727204580.47500: in VariableManager get_vars() 46400 1727204580.47547: Calling all_inventory to load vars for managed-node2 46400 1727204580.47550: Calling groups_inventory to load vars for managed-node2 46400 1727204580.47553: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.47570: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.47574: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.47578: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.50088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 46400 1727204580.54538: done with get_vars() 46400 1727204580.54611: variable 'ansible_search_path' from source: unknown 46400 1727204580.54627: we have included files to process 46400 1727204580.54628: generating all_blocks data 46400 1727204580.54630: done generating all_blocks data 46400 1727204580.54640: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204580.54641: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204580.54643: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204580.56127: in VariableManager get_vars() 46400 1727204580.56148: done with get_vars() 46400 1727204580.56397: in VariableManager get_vars() 46400 1727204580.56418: done with get_vars() 46400 1727204580.56642: in VariableManager get_vars() 46400 1727204580.56699: done with get_vars() 46400 1727204580.56886: in VariableManager get_vars() 46400 1727204580.56903: done with get_vars() 46400 1727204580.57094: in VariableManager get_vars() 46400 1727204580.57165: done with get_vars() 46400 1727204580.58201: in VariableManager get_vars() 46400 1727204580.58219: done with get_vars() 46400 1727204580.58230: done processing included file 46400 1727204580.58232: iterating over new_blocks loaded from include file 46400 1727204580.58233: in VariableManager get_vars() 46400 1727204580.58244: done with get_vars() 46400 1727204580.58246: filtering new block on tags 46400 1727204580.58462: done filtering new block on tags 46400 1727204580.58466: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204580.58471: extending task lists for all hosts with included blocks 46400 1727204580.58621: done extending task lists 46400 1727204580.58622: done processing included files 46400 1727204580.58623: results queue empty 46400 1727204580.58623: checking for any_errors_fatal 46400 1727204580.58629: done checking for any_errors_fatal 46400 1727204580.58630: checking for max_fail_percentage 46400 1727204580.58631: done checking for max_fail_percentage 46400 1727204580.58632: checking to see if all hosts have failed and the running result is not ok 46400 1727204580.58632: done checking to see if all hosts have failed 46400 1727204580.58633: getting the remaining hosts for this loop 46400 1727204580.58634: done getting the remaining hosts for this loop 46400 1727204580.58637: getting the next task for host managed-node2 46400 1727204580.58640: done getting next task for host managed-node2 46400 1727204580.58643: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204580.58645: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204580.58647: getting variables 46400 1727204580.58648: in VariableManager get_vars() 46400 1727204580.58657: Calling all_inventory to load vars for managed-node2 46400 1727204580.58659: Calling groups_inventory to load vars for managed-node2 46400 1727204580.58662: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.58668: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.58671: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.58673: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.61685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204580.63806: done with get_vars() 46400 1727204580.63869: done getting variables 46400 1727204580.63920: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204580.64057: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:03:00 -0400 (0:00:00.213) 0:01:10.925 ***** 46400 1727204580.64121: entering _queue_task() for managed-node2/debug 46400 1727204580.64484: worker is 1 (out of 1 available) 46400 1727204580.64508: exiting _queue_task() for managed-node2/debug 46400 1727204580.64526: done queuing things up, now waiting for results queue to drain 46400 1727204580.64528: waiting for pending results... 
46400 1727204580.64828: running TaskExecutor() for managed-node2/TASK: TEST: I can take a profile down that is absent 46400 1727204580.64988: in run() - task 0affcd87-79f5-1303-fda8-000000001744 46400 1727204580.65018: variable 'ansible_search_path' from source: unknown 46400 1727204580.65028: variable 'ansible_search_path' from source: unknown 46400 1727204580.65094: calling self._execute() 46400 1727204580.65250: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.65266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.65303: variable 'omit' from source: magic vars 46400 1727204580.65769: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.65788: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.65799: variable 'omit' from source: magic vars 46400 1727204580.65861: variable 'omit' from source: magic vars 46400 1727204580.66043: variable 'lsr_description' from source: include params 46400 1727204580.66085: variable 'omit' from source: magic vars 46400 1727204580.66135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204580.66196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.66239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204580.66266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.66287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.66346: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.66382: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.66397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.66515: Set connection var ansible_shell_type to sh 46400 1727204580.66530: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.66549: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.66572: Set connection var ansible_connection to ssh 46400 1727204580.66586: Set connection var ansible_pipelining to False 46400 1727204580.66618: Set connection var ansible_timeout to 10 46400 1727204580.66668: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.66686: variable 'ansible_connection' from source: unknown 46400 1727204580.66705: variable 'ansible_module_compression' from source: unknown 46400 1727204580.66722: variable 'ansible_shell_type' from source: unknown 46400 1727204580.66734: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.66742: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.66750: variable 'ansible_pipelining' from source: unknown 46400 1727204580.66756: variable 'ansible_timeout' from source: unknown 46400 1727204580.66765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.67013: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204580.67052: variable 'omit' from source: magic vars 46400 1727204580.67069: starting attempt loop 46400 1727204580.67076: running the handler 46400 1727204580.67139: handler run complete 46400 1727204580.67163: attempt loop complete, returning result 46400 1727204580.67175: _execute() done 46400 1727204580.67182: dumping result to json 46400 1727204580.67189: done dumping result, returning 46400 1727204580.67199: done running TaskExecutor() for managed-node2/TASK: TEST: I can take a profile down that is absent [0affcd87-79f5-1303-fda8-000000001744] 46400 1727204580.67208: sending task result for task 0affcd87-79f5-1303-fda8-000000001744 ok: [managed-node2] => {} MSG: ########## I can take a profile down that is absent ########## 46400 1727204580.67393: no more pending results, returning what we have 46400 1727204580.67397: results queue empty 46400 1727204580.67399: checking for any_errors_fatal 46400 1727204580.67400: done checking for any_errors_fatal 46400 1727204580.67401: checking for max_fail_percentage 46400 1727204580.67403: done checking for max_fail_percentage 46400 1727204580.67404: checking to see if all hosts have failed and the running result is not ok 46400 1727204580.67405: done checking to see if all hosts have failed 46400 1727204580.67405: getting the remaining hosts for this loop 46400 1727204580.67407: done getting the remaining hosts for this loop 46400 1727204580.67411: getting the next task for host managed-node2 46400 1727204580.67420: done getting next task for host managed-node2 46400 1727204580.67423: ^ task is: TASK: Show item 46400 1727204580.67426: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204580.67431: getting variables 46400 1727204580.67432: in VariableManager get_vars() 46400 1727204580.67491: Calling all_inventory to load vars for managed-node2 46400 1727204580.67494: Calling groups_inventory to load vars for managed-node2 46400 1727204580.67498: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.67511: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.67514: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.67517: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.68539: done sending task result for task 0affcd87-79f5-1303-fda8-000000001744 46400 1727204580.68543: WORKER PROCESS EXITING 46400 1727204580.69982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204580.72246: done with get_vars() 46400 1727204580.72286: done getting variables 46400 1727204580.72473: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:03:00 -0400 (0:00:00.083) 0:01:11.009 ***** 46400 1727204580.72504: entering _queue_task() for managed-node2/debug 46400 1727204580.73128: worker is 1 (out of 1 available) 46400 1727204580.73142: exiting _queue_task() for managed-node2/debug 46400 1727204580.73155: done queuing things up, now waiting for results queue to drain 46400 1727204580.73157: waiting for pending results... 
46400 1727204580.73480: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204580.73598: in run() - task 0affcd87-79f5-1303-fda8-000000001745 46400 1727204580.73626: variable 'ansible_search_path' from source: unknown 46400 1727204580.73636: variable 'ansible_search_path' from source: unknown 46400 1727204580.73705: variable 'omit' from source: magic vars 46400 1727204580.73890: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.73907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.73923: variable 'omit' from source: magic vars 46400 1727204580.74329: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.74346: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.74356: variable 'omit' from source: magic vars 46400 1727204580.74412: variable 'omit' from source: magic vars 46400 1727204580.74468: variable 'item' from source: unknown 46400 1727204580.74549: variable 'item' from source: unknown 46400 1727204580.74578: variable 'omit' from source: magic vars 46400 1727204580.74633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204580.74678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.74709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204580.74737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.74882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.74916: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.74926: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.74933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.75039: Set connection var ansible_shell_type to sh 46400 1727204580.75051: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.75059: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.75071: Set connection var ansible_connection to ssh 46400 1727204580.75079: Set connection var ansible_pipelining to False 46400 1727204580.75086: Set connection var ansible_timeout to 10 46400 1727204580.75113: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.75119: variable 'ansible_connection' from source: unknown 46400 1727204580.75124: variable 'ansible_module_compression' from source: unknown 46400 1727204580.75128: variable 'ansible_shell_type' from source: unknown 46400 1727204580.75133: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.75137: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.75143: variable 'ansible_pipelining' from source: unknown 46400 1727204580.75147: variable 'ansible_timeout' from source: unknown 46400 1727204580.75152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.75292: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.75312: variable 'omit' from source: magic vars 46400 1727204580.75322: starting attempt loop 46400 1727204580.75328: running the handler 46400 1727204580.76008: variable 'lsr_description' from source: include params 46400 1727204580.76173: variable 'lsr_description' from source: include params 46400 1727204580.76219: handler run complete 46400 1727204580.76241: attempt loop complete, returning result 46400 1727204580.76335: variable 'item' from source: unknown 46400 1727204580.76407: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 46400 1727204580.76780: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.76831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.76846: variable 'omit' from source: magic vars 46400 1727204580.77199: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.77268: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.77280: variable 'omit' from source: magic vars 46400 1727204580.77299: variable 'omit' from source: magic vars 46400 1727204580.77345: variable 'item' from source: unknown 46400 1727204580.77451: variable 'item' from source: unknown 46400 1727204580.77479: variable 'omit' from source: magic vars 46400 1727204580.77507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.77525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.77536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.77553: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.77567: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.77577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.77662: Set connection var ansible_shell_type to sh 46400 1727204580.77682: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.77697: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.77707: Set connection var ansible_connection to ssh 46400 1727204580.77716: Set connection var ansible_pipelining to False 46400 1727204580.77727: Set connection var ansible_timeout to 10 46400 1727204580.77751: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.77788: variable 'ansible_connection' from source: unknown 46400 1727204580.77801: variable 'ansible_module_compression' from source: unknown 46400 1727204580.77809: variable 'ansible_shell_type' from source: unknown 46400 1727204580.77816: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.77896: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.77909: variable 'ansible_pipelining' from source: unknown 46400 1727204580.77917: variable 'ansible_timeout' from source: unknown 46400 1727204580.77924: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.78146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.78165: variable 'omit' from source: magic vars 46400 1727204580.78175: starting attempt loop 46400 1727204580.78181: running the handler 46400 1727204580.78207: variable 'lsr_setup' from source: include params 46400 1727204580.78305: variable 'lsr_setup' from source: include params 46400 1727204580.78489: handler run complete 46400 1727204580.78508: attempt loop complete, returning result 46400 1727204580.78569: variable 'item' from source: unknown 46400 1727204580.78720: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 46400 1727204580.79022: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.79036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.79049: variable 'omit' from source: magic vars 46400 1727204580.79338: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.79507: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.79516: variable 'omit' from source: magic vars 46400 1727204580.79534: variable 'omit' from source: magic vars 46400 1727204580.79584: variable 'item' from source: unknown 46400 1727204580.79655: variable 'item' from source: unknown 46400 1727204580.79733: variable 'omit' from source: magic vars 46400 1727204580.79758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.79837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.79849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.79871: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.79879: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.79913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.80037: Set connection var ansible_shell_type to sh 46400 1727204580.80110: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.80169: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.80180: Set connection var ansible_connection to ssh 46400 1727204580.80190: Set connection var ansible_pipelining to False 46400 1727204580.80200: Set connection var ansible_timeout to 10 46400 1727204580.80226: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.80275: variable 'ansible_connection' from source: unknown 46400 1727204580.80290: variable 'ansible_module_compression' from source: unknown 46400 1727204580.80324: variable 'ansible_shell_type' from source: unknown 46400 1727204580.80332: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.80339: variable 'ansible_host' from 
source: host vars for 'managed-node2' 46400 1727204580.80347: variable 'ansible_pipelining' from source: unknown 46400 1727204580.80383: variable 'ansible_timeout' from source: unknown 46400 1727204580.80392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.80621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.80635: variable 'omit' from source: magic vars 46400 1727204580.80709: starting attempt loop 46400 1727204580.80717: running the handler 46400 1727204580.80742: variable 'lsr_test' from source: include params 46400 1727204580.80894: variable 'lsr_test' from source: include params 46400 1727204580.80940: handler run complete 46400 1727204580.80986: attempt loop complete, returning result 46400 1727204580.81059: variable 'item' from source: unknown 46400 1727204580.81271: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 46400 1727204580.81431: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.81495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.81537: variable 'omit' from source: magic vars 46400 1727204580.81884: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.81943: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.81953: variable 'omit' from source: magic vars 46400 1727204580.81978: variable 'omit' from source: magic vars 46400 1727204580.82023: variable 'item' from source: unknown 46400 1727204580.82098: variable 'item' from source: unknown 46400 1727204580.82117: variable 'omit' from source: magic vars 46400 1727204580.82147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.82159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.82177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.82193: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.82201: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.82208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.82295: Set connection var ansible_shell_type to sh 46400 1727204580.82308: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.82317: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.82326: Set connection var ansible_connection to ssh 46400 1727204580.82335: Set connection var ansible_pipelining to False 46400 1727204580.82343: Set connection var ansible_timeout to 10 46400 1727204580.82378: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.82387: variable 'ansible_connection' from source: unknown 46400 1727204580.82395: variable 'ansible_module_compression' from source: unknown 46400 1727204580.82401: variable 'ansible_shell_type' from 
source: unknown 46400 1727204580.82408: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.82414: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.82422: variable 'ansible_pipelining' from source: unknown 46400 1727204580.82429: variable 'ansible_timeout' from source: unknown 46400 1727204580.82437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.82537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.82551: variable 'omit' from source: magic vars 46400 1727204580.82559: starting attempt loop 46400 1727204580.82572: running the handler 46400 1727204580.82602: variable 'lsr_assert' from source: include params 46400 1727204580.82672: variable 'lsr_assert' from source: include params 46400 1727204580.82700: handler run complete 46400 1727204580.82717: attempt loop complete, returning result 46400 1727204580.82735: variable 'item' from source: unknown 46400 1727204580.82806: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 46400 1727204580.82976: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.82991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.83005: variable 'omit' from source: magic vars 46400 1727204580.83472: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.83483: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.83496: variable 'omit' from source: magic vars 46400 1727204580.83514: variable 'omit' from source: magic vars 46400 1727204580.83614: variable 'item' from source: unknown 46400 1727204580.83735: variable 'item' from source: unknown 46400 1727204580.83895: variable 'omit' from source: magic vars 46400 1727204580.83922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.83934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.83944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.83959: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.83971: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.83979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.84065: Set connection var ansible_shell_type to sh 46400 1727204580.84230: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.84242: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.84252: Set connection var ansible_connection to ssh 46400 1727204580.84268: Set connection var ansible_pipelining to False 46400 1727204580.84280: Set connection var ansible_timeout to 10 46400 1727204580.84305: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.84312: variable 'ansible_connection' 
from source: unknown 46400 1727204580.84323: variable 'ansible_module_compression' from source: unknown 46400 1727204580.84333: variable 'ansible_shell_type' from source: unknown 46400 1727204580.84340: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.84346: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.84353: variable 'ansible_pipelining' from source: unknown 46400 1727204580.84358: variable 'ansible_timeout' from source: unknown 46400 1727204580.84458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.84724: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.84811: variable 'omit' from source: magic vars 46400 1727204580.84877: starting attempt loop 46400 1727204580.85009: running the handler 46400 1727204580.85035: variable 'lsr_assert_when' from source: include params 46400 1727204580.85189: variable 'lsr_assert_when' from source: include params 46400 1727204580.85350: variable 'network_provider' from source: set_fact 46400 1727204580.85393: handler run complete 46400 1727204580.85414: attempt loop complete, returning result 46400 1727204580.85434: variable 'item' from source: unknown 46400 1727204580.85509: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 46400 1727204580.85690: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.85703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.85717: variable 'omit' from source: magic vars 46400 1727204580.85892: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.85903: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.85912: variable 'omit' from source: magic vars 46400 1727204580.85931: variable 'omit' from source: magic vars 46400 1727204580.85986: variable 'item' from source: unknown 46400 1727204580.86050: variable 'item' from source: unknown 46400 1727204580.86079: variable 'omit' from source: magic vars 46400 1727204580.86102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.86114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.86125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.86140: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.86148: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.86156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.86242: Set connection var ansible_shell_type to sh 46400 1727204580.86256: Set connection var ansible_shell_executable to /bin/sh 46400 1727204580.86273: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.86290: Set connection var 
ansible_connection to ssh 46400 1727204580.86300: Set connection var ansible_pipelining to False 46400 1727204580.86310: Set connection var ansible_timeout to 10 46400 1727204580.86335: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.86343: variable 'ansible_connection' from source: unknown 46400 1727204580.86350: variable 'ansible_module_compression' from source: unknown 46400 1727204580.86357: variable 'ansible_shell_type' from source: unknown 46400 1727204580.86369: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.86375: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.86384: variable 'ansible_pipelining' from source: unknown 46400 1727204580.86396: variable 'ansible_timeout' from source: unknown 46400 1727204580.86405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.86511: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.86524: variable 'omit' from source: magic vars 46400 1727204580.86534: starting attempt loop 46400 1727204580.86540: running the handler 46400 1727204580.86573: variable 'lsr_fail_debug' from source: play vars 46400 1727204580.86642: variable 'lsr_fail_debug' from source: play vars 46400 1727204580.86669: handler run complete 46400 1727204580.86688: attempt loop complete, returning result 46400 1727204580.86707: variable 'item' from source: unknown 46400 1727204580.86982: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 1727204580.87296: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.87312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.87326: variable 'omit' from source: magic vars 46400 1727204580.87502: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.87513: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.87521: variable 'omit' from source: magic vars 46400 1727204580.87539: variable 'omit' from source: magic vars 46400 1727204580.87596: variable 'item' from source: unknown 46400 1727204580.87665: variable 'item' from source: unknown 46400 1727204580.87691: variable 'omit' from source: magic vars 46400 1727204580.87716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204580.87733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.87744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204580.87758: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204580.87771: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.87784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.87854: Set connection var ansible_shell_type to sh 46400 1727204580.87875: Set connection 
var ansible_shell_executable to /bin/sh 46400 1727204580.87892: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204580.87919: Set connection var ansible_connection to ssh 46400 1727204580.87930: Set connection var ansible_pipelining to False 46400 1727204580.87940: Set connection var ansible_timeout to 10 46400 1727204580.87972: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.87980: variable 'ansible_connection' from source: unknown 46400 1727204580.87988: variable 'ansible_module_compression' from source: unknown 46400 1727204580.88002: variable 'ansible_shell_type' from source: unknown 46400 1727204580.88008: variable 'ansible_shell_executable' from source: unknown 46400 1727204580.88015: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.88023: variable 'ansible_pipelining' from source: unknown 46400 1727204580.88029: variable 'ansible_timeout' from source: unknown 46400 1727204580.88037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.88136: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204580.88149: variable 'omit' from source: magic vars 46400 1727204580.88157: starting attempt loop 46400 1727204580.88168: running the handler 46400 1727204580.88192: variable 'lsr_cleanup' from source: include params 46400 1727204580.88267: variable 'lsr_cleanup' from source: include params 46400 1727204580.88288: handler run complete 46400 1727204580.88304: attempt loop complete, returning result 46400 1727204580.88329: variable 'item' from source: unknown 46400 1727204580.88395: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 46400 1727204580.88516: dumping result to json 46400 1727204580.88530: done dumping result, returning 46400 1727204580.88543: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-000000001745] 46400 1727204580.88554: sending task result for task 0affcd87-79f5-1303-fda8-000000001745 46400 1727204580.88706: no more pending results, returning what we have 46400 1727204580.88712: results queue empty 46400 1727204580.88713: checking for any_errors_fatal 46400 1727204580.88722: done checking for any_errors_fatal 46400 1727204580.88723: checking for max_fail_percentage 46400 1727204580.88725: done checking for max_fail_percentage 46400 1727204580.88726: checking to see if all hosts have failed and the running result is not ok 46400 1727204580.88727: done checking to see if all hosts have failed 46400 1727204580.88728: getting the remaining hosts for this loop 46400 1727204580.88730: done getting the remaining hosts for this loop 46400 1727204580.88734: getting the next task for host managed-node2 46400 1727204580.88742: done getting next task for host managed-node2 46400 1727204580.88745: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204580.88749: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204580.88753: getting variables 46400 1727204580.88755: in VariableManager get_vars() 46400 1727204580.88801: Calling all_inventory to load vars for managed-node2 46400 1727204580.88804: Calling groups_inventory to load vars for managed-node2 46400 1727204580.88809: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.88821: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.88824: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.88827: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.89893: done sending task result for task 0affcd87-79f5-1303-fda8-000000001745 46400 1727204580.89896: WORKER PROCESS EXITING 46400 1727204580.90957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204580.93942: done with get_vars() 46400 1727204580.94001: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:03:00 -0400 (0:00:00.217) 0:01:11.226 ***** 46400 1727204580.94222: entering _queue_task() for managed-node2/include_tasks 46400 1727204580.94592: worker is 1 (out of 1 available) 46400 1727204580.94607: exiting _queue_task() for managed-node2/include_tasks 46400 1727204580.94623: done queuing things up, now waiting for results queue to drain 46400 1727204580.94625: waiting for pending results... 
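For reference, the "Show item" task whose per-item results appear above is a debug loop over variable names; the following is a hedged sketch inferred from the items printed in this log, not text copied from the collection:

- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup

With var: "{{ item }}", each loop result is keyed by the variable's own name, which matches the "lsr_description", "lsr_setup", "lsr_test", "lsr_assert", "lsr_assert_when", "lsr_fail_debug" and "lsr_cleanup" entries shown in the output above.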
46400 1727204580.94938: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204580.95065: in run() - task 0affcd87-79f5-1303-fda8-000000001746 46400 1727204580.95093: variable 'ansible_search_path' from source: unknown 46400 1727204580.95101: variable 'ansible_search_path' from source: unknown 46400 1727204580.95142: calling self._execute() 46400 1727204580.95248: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204580.95259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204580.95279: variable 'omit' from source: magic vars 46400 1727204580.95797: variable 'ansible_distribution_major_version' from source: facts 46400 1727204580.95815: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204580.95844: _execute() done 46400 1727204580.95949: dumping result to json 46400 1727204580.95966: done dumping result, returning 46400 1727204580.95980: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-000000001746] 46400 1727204580.95991: sending task result for task 0affcd87-79f5-1303-fda8-000000001746 46400 1727204580.96123: no more pending results, returning what we have 46400 1727204580.96129: in VariableManager get_vars() 46400 1727204580.96188: Calling all_inventory to load vars for managed-node2 46400 1727204580.96191: Calling groups_inventory to load vars for managed-node2 46400 1727204580.96195: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204580.96212: Calling all_plugins_play to load vars for managed-node2 46400 1727204580.96216: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204580.96219: Calling groups_plugins_play to load vars for managed-node2 46400 1727204580.97316: done sending task result for task 0affcd87-79f5-1303-fda8-000000001746 46400 1727204580.97320: WORKER PROCESS EXITING 46400 1727204580.99040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.02447: done with get_vars() 46400 1727204581.02485: variable 'ansible_search_path' from source: unknown 46400 1727204581.02487: variable 'ansible_search_path' from source: unknown 46400 1727204581.02529: we have included files to process 46400 1727204581.02531: generating all_blocks data 46400 1727204581.02647: done generating all_blocks data 46400 1727204581.02654: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204581.02655: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204581.02658: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204581.02993: in VariableManager get_vars() 46400 1727204581.03018: done with get_vars() 46400 1727204581.03256: done processing included file 46400 1727204581.03258: iterating over new_blocks loaded from include file 46400 1727204581.03262: in VariableManager get_vars() 46400 1727204581.03284: done with get_vars() 46400 1727204581.03286: filtering new block on tags 46400 1727204581.03398: done filtering new block on tags 46400 1727204581.03512: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204581.03518: extending task lists for all hosts with included blocks 46400 1727204581.04405: done extending task lists 46400 1727204581.04407: done processing included files 46400 1727204581.04408: results queue empty 46400 1727204581.04409: checking for any_errors_fatal 46400 1727204581.04417: done checking for any_errors_fatal 46400 1727204581.04418: checking for max_fail_percentage 46400 1727204581.04419: done checking for max_fail_percentage 46400 1727204581.04420: checking to see if all hosts have failed and the running result is not ok 46400 1727204581.04420: done checking to see if all hosts have failed 46400 1727204581.04421: getting the remaining hosts for this loop 46400 1727204581.04422: done getting the remaining hosts for this loop 46400 1727204581.04425: getting the next task for host managed-node2 46400 1727204581.04429: done getting next task for host managed-node2 46400 1727204581.04431: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204581.04435: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204581.04437: getting variables 46400 1727204581.04439: in VariableManager get_vars() 46400 1727204581.04451: Calling all_inventory to load vars for managed-node2 46400 1727204581.04454: Calling groups_inventory to load vars for managed-node2 46400 1727204581.04456: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.04462: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.04466: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.04470: Calling groups_plugins_play to load vars for managed-node2 46400 1727204581.06138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.08968: done with get_vars() 46400 1727204581.09004: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:01 -0400 (0:00:00.148) 0:01:11.375 ***** 46400 1727204581.09097: entering _queue_task() for managed-node2/include_tasks 46400 1727204581.09451: worker is 1 (out of 1 available) 46400 1727204581.09468: exiting _queue_task() for managed-node2/include_tasks 46400 1727204581.09482: done queuing things up, now waiting for results queue to drain 46400 1727204581.09487: waiting for pending results... 
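The two includes traced here (run_test.yml:21 pulling in show_interfaces.yml, which in turn pulls in get_current_interfaces.yml at its line 3) would correspond to include_tasks entries of roughly this shape; this is a sketch assuming relative paths, not the literal file contents:

# tasks/run_test.yml, around line 21 (sketch)
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

# tasks/show_interfaces.yml, around line 3 (sketch)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml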
46400 1727204581.09789: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204581.09899: in run() - task 0affcd87-79f5-1303-fda8-00000000176d 46400 1727204581.09920: variable 'ansible_search_path' from source: unknown 46400 1727204581.09933: variable 'ansible_search_path' from source: unknown 46400 1727204581.09976: calling self._execute() 46400 1727204581.10344: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.10356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.10824: variable 'omit' from source: magic vars 46400 1727204581.12206: variable 'ansible_distribution_major_version' from source: facts 46400 1727204581.12551: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204581.12586: _execute() done 46400 1727204581.12636: dumping result to json 46400 1727204581.12709: done dumping result, returning 46400 1727204581.12720: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-00000000176d] 46400 1727204581.12731: sending task result for task 0affcd87-79f5-1303-fda8-00000000176d 46400 1727204581.12925: no more pending results, returning what we have 46400 1727204581.12936: in VariableManager get_vars() 46400 1727204581.13020: Calling all_inventory to load vars for managed-node2 46400 1727204581.13024: Calling groups_inventory to load vars for managed-node2 46400 1727204581.13029: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.13057: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.13066: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.13071: Calling groups_plugins_play to load vars for managed-node2 46400 1727204581.14822: done sending task result for task 0affcd87-79f5-1303-fda8-00000000176d 46400 1727204581.14826: WORKER PROCESS EXITING 46400 1727204581.15665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.19049: done with get_vars() 46400 1727204581.19178: variable 'ansible_search_path' from source: unknown 46400 1727204581.19180: variable 'ansible_search_path' from source: unknown 46400 1727204581.19248: we have included files to process 46400 1727204581.19249: generating all_blocks data 46400 1727204581.19251: done generating all_blocks data 46400 1727204581.19253: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204581.19254: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204581.19256: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204581.19581: done processing included file 46400 1727204581.19584: iterating over new_blocks loaded from include file 46400 1727204581.19586: in VariableManager get_vars() 46400 1727204581.19606: done with get_vars() 46400 1727204581.19608: filtering new block on tags 46400 1727204581.19657: done filtering new block on tags 46400 1727204581.19660: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 46400 1727204581.19667: extending task lists for all hosts with included blocks 46400 1727204581.19861: done extending task lists 46400 1727204581.19863: done processing included files 46400 1727204581.19865: results queue empty 46400 1727204581.19865: checking for any_errors_fatal 46400 1727204581.19869: done checking for any_errors_fatal 46400 1727204581.19869: checking for max_fail_percentage 46400 1727204581.19871: done checking for max_fail_percentage 46400 1727204581.19872: checking to see if all hosts have failed and the running result is not ok 46400 1727204581.19872: done checking to see if all hosts have failed 46400 1727204581.19873: getting the remaining hosts for this loop 46400 1727204581.19874: done getting the remaining hosts for this loop 46400 1727204581.19877: getting the next task for host managed-node2 46400 1727204581.19881: done getting next task for host managed-node2 46400 1727204581.19883: ^ task is: TASK: Gather current interface info 46400 1727204581.19887: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204581.19890: getting variables 46400 1727204581.19891: in VariableManager get_vars() 46400 1727204581.19903: Calling all_inventory to load vars for managed-node2 46400 1727204581.19905: Calling groups_inventory to load vars for managed-node2 46400 1727204581.19907: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.19913: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.19915: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.19918: Calling groups_plugins_play to load vars for managed-node2 46400 1727204581.21806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.24556: done with get_vars() 46400 1727204581.24593: done getting variables 46400 1727204581.24643: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:01 -0400 (0:00:00.155) 0:01:11.531 ***** 46400 1727204581.24680: entering _queue_task() for managed-node2/command 46400 1727204581.25097: worker is 1 (out of 1 available) 46400 1727204581.25184: exiting _queue_task() for managed-node2/command 46400 1727204581.25225: done queuing things up, now waiting for results queue to drain 46400 1727204581.25288: waiting for pending results... 
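The "Gather current interface info" task queued next is a plain command-module call; the module_args echoed further down in this log (chdir=/sys/class/net, _raw_params="ls -1") imply a task of the following shape. The register name current_interfaces_info is an assumption added for illustration:

# tasks/get_current_interfaces.yml, around line 3 (sketch)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: current_interfaces_info   # hypothetical name, not taken from this log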
46400 1727204581.26178: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204581.27336: in run() - task 0affcd87-79f5-1303-fda8-0000000017a8 46400 1727204581.27357: variable 'ansible_search_path' from source: unknown 46400 1727204581.27371: variable 'ansible_search_path' from source: unknown 46400 1727204581.27422: calling self._execute() 46400 1727204581.27536: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.27549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.27566: variable 'omit' from source: magic vars 46400 1727204581.28198: variable 'ansible_distribution_major_version' from source: facts 46400 1727204581.28216: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204581.28294: variable 'omit' from source: magic vars 46400 1727204581.28355: variable 'omit' from source: magic vars 46400 1727204581.28511: variable 'omit' from source: magic vars 46400 1727204581.28569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204581.28645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204581.28746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204581.28774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.28840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.28880: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204581.28937: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.28948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.29056: Set connection var ansible_shell_type to sh 46400 1727204581.29076: Set connection var ansible_shell_executable to /bin/sh 46400 1727204581.29086: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204581.29095: Set connection var ansible_connection to ssh 46400 1727204581.29104: Set connection var ansible_pipelining to False 46400 1727204581.29113: Set connection var ansible_timeout to 10 46400 1727204581.29142: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.29152: variable 'ansible_connection' from source: unknown 46400 1727204581.29168: variable 'ansible_module_compression' from source: unknown 46400 1727204581.29175: variable 'ansible_shell_type' from source: unknown 46400 1727204581.29182: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.29188: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.29196: variable 'ansible_pipelining' from source: unknown 46400 1727204581.29202: variable 'ansible_timeout' from source: unknown 46400 1727204581.29209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.29358: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204581.29386: variable 'omit' from source: magic vars 46400 
1727204581.29396: starting attempt loop 46400 1727204581.29402: running the handler 46400 1727204581.29422: _low_level_execute_command(): starting 46400 1727204581.29433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204581.30970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.30975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.30997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204581.31001: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.31073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.31393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.31640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.33582: stdout chunk (state=3): >>>/root <<< 46400 1727204581.33588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204581.33591: stdout chunk (state=3): >>><<< 46400 1727204581.33596: stderr chunk (state=3): >>><<< 46400 1727204581.34190: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204581.34194: _low_level_execute_command(): starting 46400 1727204581.34199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065 `" && echo ansible-tmp-1727204581.336333-51470-241328405849065="` 
echo /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065 `" ) && sleep 0' 46400 1727204581.35811: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.35816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.35905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.35909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.35945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.35952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.36120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204581.36126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.36144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.36280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.38078: stdout chunk (state=3): >>>ansible-tmp-1727204581.336333-51470-241328405849065=/root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065 <<< 46400 1727204581.38293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204581.38297: stdout chunk (state=3): >>><<< 46400 1727204581.38370: stderr chunk (state=3): >>><<< 46400 1727204581.38374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204581.336333-51470-241328405849065=/root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204581.38377: variable 'ansible_module_compression' from source: unknown 46400 
1727204581.38552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204581.38555: variable 'ansible_facts' from source: unknown 46400 1727204581.38598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/AnsiballZ_command.py 46400 1727204581.39267: Sending initial data 46400 1727204581.39272: Sent initial data (155 bytes) 46400 1727204581.41739: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204581.41755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.41776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.41796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.41953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.41972: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204581.41988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.42006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204581.42019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204581.42031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204581.42049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.42069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.42086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.42100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.42113: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204581.42127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.42209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204581.42292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.42309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.42379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.44129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204581.44179: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204581.44271: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-46400rspozge8/tmp4vcy6wlr /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/AnsiballZ_command.py <<< 46400 1727204581.44274: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204581.46267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204581.46498: stderr chunk (state=3): >>><<< 46400 1727204581.46502: stdout chunk (state=3): >>><<< 46400 1727204581.46562: done transferring module to remote 46400 1727204581.46596: _low_level_execute_command(): starting 46400 1727204581.46650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/ /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/AnsiballZ_command.py && sleep 0' 46400 1727204581.48418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.48422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.48457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204581.48467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.48470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.48665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.48713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.48926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.50699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204581.50703: stdout chunk (state=3): >>><<< 46400 1727204581.50718: stderr chunk (state=3): >>><<< 46400 1727204581.50736: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204581.50740: _low_level_execute_command(): starting 46400 1727204581.50742: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/AnsiballZ_command.py && sleep 0' 46400 1727204581.52576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204581.52718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.52736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.52753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.52874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.52882: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204581.52900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.52927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204581.52931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204581.52970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204581.52985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.53000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.53046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.53055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.53062: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204581.53078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.53239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204581.53376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.53383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.53536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.67087: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:01.666625", "end": "2024-09-24 15:03:01.669916", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204581.68296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204581.68387: stderr chunk (state=3): >>><<< 46400 1727204581.68390: stdout chunk (state=3): >>><<< 46400 1727204581.68402: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:01.666625", "end": "2024-09-24 15:03:01.669916", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
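The module returned stdout "bonding_masters\neth0\nlo" above, and the next task the strategy selects (visible at the end of this excerpt) is "Set current_interfaces". A minimal sketch of that step, reusing the hypothetical current_interfaces_info register from the sketch above, would be:

# Sketch of the follow-up 'Set current_interfaces' task
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ current_interfaces_info.stdout_lines }}"

which for this run would produce the list ['bonding_masters', 'eth0', 'lo'].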
46400 1727204581.68444: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204581.68451: _low_level_execute_command(): starting 46400 1727204581.68456: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204581.336333-51470-241328405849065/ > /dev/null 2>&1 && sleep 0' 46400 1727204581.69971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204581.70284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.70294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.70309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.70352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.70363: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204581.70369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.70383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204581.70390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204581.70396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204581.70404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204581.70413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204581.70424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204581.70431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204581.70438: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204581.70448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204581.70530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204581.70539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204581.70547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204581.70685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204581.72670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204581.72674: stdout chunk (state=3): >>><<< 46400 1727204581.72676: stderr chunk (state=3): >>><<< 46400 1727204581.72678: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204581.72680: handler run complete 46400 1727204581.72681: Evaluated conditional (False): False 46400 1727204581.72683: attempt loop complete, returning result 46400 1727204581.72684: _execute() done 46400 1727204581.72686: dumping result to json 46400 1727204581.72687: done dumping result, returning 46400 1727204581.72689: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-0000000017a8] 46400 1727204581.72691: sending task result for task 0affcd87-79f5-1303-fda8-0000000017a8 46400 1727204581.72756: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017a8 46400 1727204581.72759: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003291", "end": "2024-09-24 15:03:01.669916", "rc": 0, "start": "2024-09-24 15:03:01.666625" } STDOUT: bonding_masters eth0 lo 46400 1727204581.72834: no more pending results, returning what we have 46400 1727204581.72837: results queue empty 46400 1727204581.72839: checking for any_errors_fatal 46400 1727204581.72841: done checking for any_errors_fatal 46400 1727204581.72841: checking for max_fail_percentage 46400 1727204581.72843: done checking for max_fail_percentage 46400 1727204581.72844: checking to see if all hosts have failed and the running result is not ok 46400 1727204581.72844: done checking to see if all hosts have failed 46400 1727204581.72845: getting the remaining hosts for this loop 46400 1727204581.72846: done getting the remaining hosts for this loop 46400 1727204581.72850: getting the next task for host managed-node2 46400 1727204581.72857: done getting next task for host managed-node2 46400 1727204581.72859: ^ task is: TASK: Set current_interfaces 46400 1727204581.72887: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204581.72892: getting variables 46400 1727204581.72893: in VariableManager get_vars() 46400 1727204581.72927: Calling all_inventory to load vars for managed-node2 46400 1727204581.72930: Calling groups_inventory to load vars for managed-node2 46400 1727204581.72933: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.72973: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.73004: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.73009: Calling groups_plugins_play to load vars for managed-node2 46400 1727204581.77210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.81482: done with get_vars() 46400 1727204581.81630: done getting variables 46400 1727204581.81697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:01 -0400 (0:00:00.571) 0:01:12.102 ***** 46400 1727204581.81799: entering _queue_task() for managed-node2/set_fact 46400 1727204581.82516: worker is 1 (out of 1 available) 46400 1727204581.82530: exiting _queue_task() for managed-node2/set_fact 46400 1727204581.82543: done queuing things up, now waiting for results queue to drain 46400 1727204581.82545: waiting for pending results... 
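Note: the Set current_interfaces task queued here (get_current_interfaces.yml:9) is a set_fact whose result, current_interfaces: ['bonding_masters', 'eth0', 'lo'], appears a few entries below. A minimal sketch, assuming the fact is built from the registered command output:

- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # use of stdout_lines is an assumption; the resulting list matches the logged fact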
46400 1727204581.83051: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204581.83153: in run() - task 0affcd87-79f5-1303-fda8-0000000017a9 46400 1727204581.83168: variable 'ansible_search_path' from source: unknown 46400 1727204581.83172: variable 'ansible_search_path' from source: unknown 46400 1727204581.83206: calling self._execute() 46400 1727204581.84159: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.84171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.84208: variable 'omit' from source: magic vars 46400 1727204581.85393: variable 'ansible_distribution_major_version' from source: facts 46400 1727204581.85449: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204581.85500: variable 'omit' from source: magic vars 46400 1727204581.85687: variable 'omit' from source: magic vars 46400 1727204581.85949: variable '_current_interfaces' from source: set_fact 46400 1727204581.86225: variable 'omit' from source: magic vars 46400 1727204581.86365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204581.86508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204581.86588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204581.86685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.86710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.86788: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204581.86792: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.86795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.86903: Set connection var ansible_shell_type to sh 46400 1727204581.86909: Set connection var ansible_shell_executable to /bin/sh 46400 1727204581.86912: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204581.86918: Set connection var ansible_connection to ssh 46400 1727204581.86947: Set connection var ansible_pipelining to False 46400 1727204581.87381: Set connection var ansible_timeout to 10 46400 1727204581.87385: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.87387: variable 'ansible_connection' from source: unknown 46400 1727204581.87389: variable 'ansible_module_compression' from source: unknown 46400 1727204581.87391: variable 'ansible_shell_type' from source: unknown 46400 1727204581.87393: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.87394: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.87396: variable 'ansible_pipelining' from source: unknown 46400 1727204581.87397: variable 'ansible_timeout' from source: unknown 46400 1727204581.87399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.87852: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204581.87859: variable 'omit' from source: magic vars 46400 1727204581.87869: starting attempt loop 46400 1727204581.87872: running the handler 46400 1727204581.87887: handler run complete 46400 1727204581.87896: attempt loop complete, returning result 46400 1727204581.87899: _execute() done 46400 1727204581.87902: dumping result to json 46400 1727204581.87904: done dumping result, returning 46400 1727204581.87913: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-0000000017a9] 46400 1727204581.87918: sending task result for task 0affcd87-79f5-1303-fda8-0000000017a9 46400 1727204581.88009: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017a9 46400 1727204581.88012: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204581.88071: no more pending results, returning what we have 46400 1727204581.88075: results queue empty 46400 1727204581.88076: checking for any_errors_fatal 46400 1727204581.88087: done checking for any_errors_fatal 46400 1727204581.88087: checking for max_fail_percentage 46400 1727204581.88089: done checking for max_fail_percentage 46400 1727204581.88090: checking to see if all hosts have failed and the running result is not ok 46400 1727204581.88091: done checking to see if all hosts have failed 46400 1727204581.88092: getting the remaining hosts for this loop 46400 1727204581.88093: done getting the remaining hosts for this loop 46400 1727204581.88098: getting the next task for host managed-node2 46400 1727204581.88108: done getting next task for host managed-node2 46400 1727204581.88111: ^ task is: TASK: Show current_interfaces 46400 1727204581.88115: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204581.88119: getting variables 46400 1727204581.88121: in VariableManager get_vars() 46400 1727204581.88158: Calling all_inventory to load vars for managed-node2 46400 1727204581.88161: Calling groups_inventory to load vars for managed-node2 46400 1727204581.88166: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.88176: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.88179: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.88181: Calling groups_plugins_play to load vars for managed-node2 46400 1727204581.91043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204581.95366: done with get_vars() 46400 1727204581.95402: done getting variables 46400 1727204581.95474: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:01 -0400 (0:00:00.137) 0:01:12.239 ***** 46400 1727204581.95510: entering _queue_task() for managed-node2/debug 46400 1727204581.96170: worker is 1 (out of 1 available) 46400 1727204581.96185: exiting _queue_task() for managed-node2/debug 46400 1727204581.96198: done queuing things up, now waiting for results queue to drain 46400 1727204581.96200: waiting for pending results... 46400 1727204581.96780: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204581.97045: in run() - task 0affcd87-79f5-1303-fda8-00000000176e 46400 1727204581.97067: variable 'ansible_search_path' from source: unknown 46400 1727204581.97076: variable 'ansible_search_path' from source: unknown 46400 1727204581.97128: calling self._execute() 46400 1727204581.97232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.97245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.97260: variable 'omit' from source: magic vars 46400 1727204581.97709: variable 'ansible_distribution_major_version' from source: facts 46400 1727204581.97748: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204581.97761: variable 'omit' from source: magic vars 46400 1727204581.97821: variable 'omit' from source: magic vars 46400 1727204581.97935: variable 'current_interfaces' from source: set_fact 46400 1727204581.97972: variable 'omit' from source: magic vars 46400 1727204581.98029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204581.98074: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204581.98108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204581.98131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.98147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204581.98217: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204581.98227: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.98236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.98339: Set connection var ansible_shell_type to sh 46400 1727204581.98355: Set connection var ansible_shell_executable to /bin/sh 46400 1727204581.98369: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204581.98380: Set connection var ansible_connection to ssh 46400 1727204581.98390: Set connection var ansible_pipelining to False 46400 1727204581.98400: Set connection var ansible_timeout to 10 46400 1727204581.98566: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.98576: variable 'ansible_connection' from source: unknown 46400 1727204581.98583: variable 'ansible_module_compression' from source: unknown 46400 1727204581.98590: variable 'ansible_shell_type' from source: unknown 46400 1727204581.98597: variable 'ansible_shell_executable' from source: unknown 46400 1727204581.98604: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204581.98612: variable 'ansible_pipelining' from source: unknown 46400 1727204581.98619: variable 'ansible_timeout' from source: unknown 46400 1727204581.98626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204581.99121: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204581.99142: variable 'omit' from source: magic vars 46400 1727204581.99152: starting attempt loop 46400 1727204581.99168: running the handler 46400 1727204581.99326: handler run complete 46400 1727204581.99349: attempt loop complete, returning result 46400 1727204581.99360: _execute() done 46400 1727204581.99369: dumping result to json 46400 1727204581.99376: done dumping result, returning 46400 1727204581.99387: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-00000000176e] 46400 1727204581.99396: sending task result for task 0affcd87-79f5-1303-fda8-00000000176e ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204581.99693: no more pending results, returning what we have 46400 1727204581.99697: results queue empty 46400 1727204581.99698: checking for any_errors_fatal 46400 1727204581.99706: done checking for any_errors_fatal 46400 1727204581.99707: checking for max_fail_percentage 46400 1727204581.99709: done checking for max_fail_percentage 46400 1727204581.99710: checking to see if all hosts have failed and the running result is not ok 46400 1727204581.99711: done checking to see if all hosts have failed 46400 1727204581.99711: getting the remaining hosts for this loop 46400 1727204581.99714: done getting the remaining hosts for this loop 46400 1727204581.99718: getting the next task for host managed-node2 46400 1727204581.99728: done getting next task for host managed-node2 46400 1727204581.99732: ^ task is: TASK: Setup 46400 1727204581.99736: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204581.99741: getting variables 46400 1727204581.99743: in VariableManager get_vars() 46400 1727204581.99789: Calling all_inventory to load vars for managed-node2 46400 1727204581.99793: Calling groups_inventory to load vars for managed-node2 46400 1727204581.99797: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204581.99809: Calling all_plugins_play to load vars for managed-node2 46400 1727204581.99813: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204581.99816: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.01102: done sending task result for task 0affcd87-79f5-1303-fda8-00000000176e 46400 1727204582.01106: WORKER PROCESS EXITING 46400 1727204582.03180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.05594: done with get_vars() 46400 1727204582.05634: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.102) 0:01:12.341 ***** 46400 1727204582.05742: entering _queue_task() for managed-node2/include_tasks 46400 1727204582.06314: worker is 1 (out of 1 available) 46400 1727204582.06328: exiting _queue_task() for managed-node2/include_tasks 46400 1727204582.06342: done queuing things up, now waiting for results queue to drain 46400 1727204582.06344: waiting for pending results... 
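Note: the Setup task queued here (run_test.yml:24) is an include_tasks that loops over lsr_setup; the per-item includes traced below resolve to tasks/create_bridge_profile.yml, tasks/activate_profile.yml and tasks/remove_profile.yml. A hedged sketch of that task's shape; the exact placement of the when condition is assumed:

- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  when: ansible_distribution_major_version != '6'  # this conditional is evaluated once per item in the log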
46400 1727204582.07455: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204582.07729: in run() - task 0affcd87-79f5-1303-fda8-000000001747 46400 1727204582.07742: variable 'ansible_search_path' from source: unknown 46400 1727204582.07890: variable 'ansible_search_path' from source: unknown 46400 1727204582.07938: variable 'lsr_setup' from source: include params 46400 1727204582.08387: variable 'lsr_setup' from source: include params 46400 1727204582.08657: variable 'omit' from source: magic vars 46400 1727204582.08928: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.08939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.08946: variable 'omit' from source: magic vars 46400 1727204582.10320: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.10327: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.10336: variable 'item' from source: unknown 46400 1727204582.10642: variable 'item' from source: unknown 46400 1727204582.10683: variable 'item' from source: unknown 46400 1727204582.10981: variable 'item' from source: unknown 46400 1727204582.11301: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.11308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.11319: variable 'omit' from source: magic vars 46400 1727204582.11956: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.11969: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.11972: variable 'item' from source: unknown 46400 1727204582.12039: variable 'item' from source: unknown 46400 1727204582.12404: variable 'item' from source: unknown 46400 1727204582.12468: variable 'item' from source: unknown 46400 1727204582.12769: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.12780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.12794: variable 'omit' from source: magic vars 46400 1727204582.13408: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.13413: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.13420: variable 'item' from source: unknown 46400 1727204582.13710: variable 'item' from source: unknown 46400 1727204582.13741: variable 'item' from source: unknown 46400 1727204582.13985: variable 'item' from source: unknown 46400 1727204582.14053: dumping result to json 46400 1727204582.14056: done dumping result, returning 46400 1727204582.14058: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-000000001747] 46400 1727204582.14061: sending task result for task 0affcd87-79f5-1303-fda8-000000001747 46400 1727204582.14101: done sending task result for task 0affcd87-79f5-1303-fda8-000000001747 46400 1727204582.14104: WORKER PROCESS EXITING 46400 1727204582.14135: no more pending results, returning what we have 46400 1727204582.14141: in VariableManager get_vars() 46400 1727204582.14192: Calling all_inventory to load vars for managed-node2 46400 1727204582.14195: Calling groups_inventory to load vars for managed-node2 46400 1727204582.14199: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.14216: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.14221: Calling groups_plugins_inventory to load vars for 
managed-node2 46400 1727204582.14224: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.15862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.18811: done with get_vars() 46400 1727204582.19246: variable 'ansible_search_path' from source: unknown 46400 1727204582.19248: variable 'ansible_search_path' from source: unknown 46400 1727204582.19303: variable 'ansible_search_path' from source: unknown 46400 1727204582.19304: variable 'ansible_search_path' from source: unknown 46400 1727204582.19337: variable 'ansible_search_path' from source: unknown 46400 1727204582.19338: variable 'ansible_search_path' from source: unknown 46400 1727204582.19367: we have included files to process 46400 1727204582.19369: generating all_blocks data 46400 1727204582.19370: done generating all_blocks data 46400 1727204582.19450: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204582.19452: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204582.19456: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204582.19776: done processing included file 46400 1727204582.19779: iterating over new_blocks loaded from include file 46400 1727204582.19780: in VariableManager get_vars() 46400 1727204582.19797: done with get_vars() 46400 1727204582.19799: filtering new block on tags 46400 1727204582.19842: done filtering new block on tags 46400 1727204582.19845: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 46400 1727204582.19850: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204582.19851: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204582.19854: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204582.19956: done processing included file 46400 1727204582.19957: iterating over new_blocks loaded from include file 46400 1727204582.19959: in VariableManager get_vars() 46400 1727204582.19978: done with get_vars() 46400 1727204582.19979: filtering new block on tags 46400 1727204582.20003: done filtering new block on tags 46400 1727204582.20005: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 46400 1727204582.20009: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204582.20010: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204582.20013: Loading data from 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 46400 1727204582.20108: done processing included file 46400 1727204582.20110: iterating over new_blocks loaded from include file 46400 1727204582.20111: in VariableManager get_vars() 46400 1727204582.20125: done with get_vars() 46400 1727204582.20127: filtering new block on tags 46400 1727204582.20156: done filtering new block on tags 46400 1727204582.20159: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed-node2 => (item=tasks/remove_profile.yml) 46400 1727204582.20162: extending task lists for all hosts with included blocks 46400 1727204582.21048: done extending task lists 46400 1727204582.21056: done processing included files 46400 1727204582.21057: results queue empty 46400 1727204582.21058: checking for any_errors_fatal 46400 1727204582.21062: done checking for any_errors_fatal 46400 1727204582.21063: checking for max_fail_percentage 46400 1727204582.21067: done checking for max_fail_percentage 46400 1727204582.21068: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.21069: done checking to see if all hosts have failed 46400 1727204582.21069: getting the remaining hosts for this loop 46400 1727204582.21071: done getting the remaining hosts for this loop 46400 1727204582.21073: getting the next task for host managed-node2 46400 1727204582.21078: done getting next task for host managed-node2 46400 1727204582.21080: ^ task is: TASK: Include network role 46400 1727204582.21129: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204582.21133: getting variables 46400 1727204582.21134: in VariableManager get_vars() 46400 1727204582.21147: Calling all_inventory to load vars for managed-node2 46400 1727204582.21149: Calling groups_inventory to load vars for managed-node2 46400 1727204582.21152: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.21157: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.21160: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.21163: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.24701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.27632: done with get_vars() 46400 1727204582.27672: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.220) 0:01:12.562 ***** 46400 1727204582.27763: entering _queue_task() for managed-node2/include_role 46400 1727204582.29176: worker is 1 (out of 1 available) 46400 1727204582.29193: exiting _queue_task() for managed-node2/include_role 46400 1727204582.29207: done queuing things up, now waiting for results queue to drain 46400 1727204582.29208: waiting for pending results... 46400 1727204582.29959: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204582.30119: in run() - task 0affcd87-79f5-1303-fda8-0000000017d0 46400 1727204582.30144: variable 'ansible_search_path' from source: unknown 46400 1727204582.30186: variable 'ansible_search_path' from source: unknown 46400 1727204582.30265: calling self._execute() 46400 1727204582.30421: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.30552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.30575: variable 'omit' from source: magic vars 46400 1727204582.31461: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.31540: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.31556: _execute() done 46400 1727204582.31618: dumping result to json 46400 1727204582.31679: done dumping result, returning 46400 1727204582.31725: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-0000000017d0] 46400 1727204582.31793: sending task result for task 0affcd87-79f5-1303-fda8-0000000017d0 46400 1727204582.32029: no more pending results, returning what we have 46400 1727204582.32036: in VariableManager get_vars() 46400 1727204582.32090: Calling all_inventory to load vars for managed-node2 46400 1727204582.32093: Calling groups_inventory to load vars for managed-node2 46400 1727204582.32097: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.32115: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.32119: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.32122: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.33268: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017d0 46400 1727204582.33274: WORKER PROCESS EXITING 46400 1727204582.34099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 
1727204582.37322: done with get_vars() 46400 1727204582.37344: variable 'ansible_search_path' from source: unknown 46400 1727204582.37346: variable 'ansible_search_path' from source: unknown 46400 1727204582.37538: variable 'omit' from source: magic vars 46400 1727204582.37584: variable 'omit' from source: magic vars 46400 1727204582.37601: variable 'omit' from source: magic vars 46400 1727204582.37606: we have included files to process 46400 1727204582.37607: generating all_blocks data 46400 1727204582.37608: done generating all_blocks data 46400 1727204582.37610: processing included file: fedora.linux_system_roles.network 46400 1727204582.37631: in VariableManager get_vars() 46400 1727204582.37653: done with get_vars() 46400 1727204582.37684: in VariableManager get_vars() 46400 1727204582.37702: done with get_vars() 46400 1727204582.37740: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204582.37879: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204582.37956: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204582.38504: in VariableManager get_vars() 46400 1727204582.38625: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204582.43822: iterating over new_blocks loaded from include file 46400 1727204582.43825: in VariableManager get_vars() 46400 1727204582.43851: done with get_vars() 46400 1727204582.43853: filtering new block on tags 46400 1727204582.44230: done filtering new block on tags 46400 1727204582.44234: in VariableManager get_vars() 46400 1727204582.44252: done with get_vars() 46400 1727204582.44254: filtering new block on tags 46400 1727204582.44277: done filtering new block on tags 46400 1727204582.44280: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204582.44290: extending task lists for all hosts with included blocks 46400 1727204582.44483: done extending task lists 46400 1727204582.44485: done processing included files 46400 1727204582.44485: results queue empty 46400 1727204582.44486: checking for any_errors_fatal 46400 1727204582.44490: done checking for any_errors_fatal 46400 1727204582.44491: checking for max_fail_percentage 46400 1727204582.44492: done checking for max_fail_percentage 46400 1727204582.44493: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.44494: done checking to see if all hosts have failed 46400 1727204582.44495: getting the remaining hosts for this loop 46400 1727204582.44496: done getting the remaining hosts for this loop 46400 1727204582.44502: getting the next task for host managed-node2 46400 1727204582.44507: done getting next task for host managed-node2 46400 1727204582.44510: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204582.44514: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204582.44525: getting variables 46400 1727204582.44526: in VariableManager get_vars() 46400 1727204582.44541: Calling all_inventory to load vars for managed-node2 46400 1727204582.44544: Calling groups_inventory to load vars for managed-node2 46400 1727204582.44546: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.44552: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.44554: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.44558: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.47567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.49838: done with get_vars() 46400 1727204582.49886: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.223) 0:01:12.785 ***** 46400 1727204582.50096: entering _queue_task() for managed-node2/include_tasks 46400 1727204582.50675: worker is 1 (out of 1 available) 46400 1727204582.50739: exiting _queue_task() for managed-node2/include_tasks 46400 1727204582.50753: done queuing things up, now waiting for results queue to drain 46400 1727204582.50754: waiting for pending results... 
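Note: the task queued here is main.yml:4 of the fedora.linux_system_roles.network role, "Ensure ansible_facts used by role"; the trace that follows shows it pulling in the role's set_facts.yml. A minimal sketch of such an include, assuming no additional arguments:

- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml  # file name taken from the include resolution logged below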
46400 1727204582.51334: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204582.51633: in run() - task 0affcd87-79f5-1303-fda8-00000000183a 46400 1727204582.51678: variable 'ansible_search_path' from source: unknown 46400 1727204582.51739: variable 'ansible_search_path' from source: unknown 46400 1727204582.51837: calling self._execute() 46400 1727204582.52022: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.52075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.52090: variable 'omit' from source: magic vars 46400 1727204582.52717: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.52738: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.52749: _execute() done 46400 1727204582.52757: dumping result to json 46400 1727204582.52767: done dumping result, returning 46400 1727204582.52778: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-00000000183a] 46400 1727204582.52789: sending task result for task 0affcd87-79f5-1303-fda8-00000000183a 46400 1727204582.52958: no more pending results, returning what we have 46400 1727204582.52965: in VariableManager get_vars() 46400 1727204582.53027: Calling all_inventory to load vars for managed-node2 46400 1727204582.53030: Calling groups_inventory to load vars for managed-node2 46400 1727204582.53032: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.53097: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.53101: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.53105: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.64001: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183a 46400 1727204582.64006: WORKER PROCESS EXITING 46400 1727204582.65638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.68229: done with get_vars() 46400 1727204582.68273: variable 'ansible_search_path' from source: unknown 46400 1727204582.68275: variable 'ansible_search_path' from source: unknown 46400 1727204582.68314: we have included files to process 46400 1727204582.68315: generating all_blocks data 46400 1727204582.68316: done generating all_blocks data 46400 1727204582.68318: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204582.68319: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204582.68321: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204582.69208: done processing included file 46400 1727204582.69211: iterating over new_blocks loaded from include file 46400 1727204582.69212: in VariableManager get_vars() 46400 1727204582.69242: done with get_vars() 46400 1727204582.69244: filtering new block on tags 46400 1727204582.69279: done filtering new block on tags 46400 1727204582.69283: in VariableManager get_vars() 46400 1727204582.69308: done with get_vars() 46400 1727204582.69309: filtering new block on tags 46400 1727204582.69355: done filtering new block on tags 46400 1727204582.69357: in 
VariableManager get_vars() 46400 1727204582.69385: done with get_vars() 46400 1727204582.69387: filtering new block on tags 46400 1727204582.69433: done filtering new block on tags 46400 1727204582.69436: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204582.69440: extending task lists for all hosts with included blocks 46400 1727204582.72526: done extending task lists 46400 1727204582.72528: done processing included files 46400 1727204582.72529: results queue empty 46400 1727204582.72529: checking for any_errors_fatal 46400 1727204582.72533: done checking for any_errors_fatal 46400 1727204582.72533: checking for max_fail_percentage 46400 1727204582.72535: done checking for max_fail_percentage 46400 1727204582.72536: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.72536: done checking to see if all hosts have failed 46400 1727204582.72537: getting the remaining hosts for this loop 46400 1727204582.72538: done getting the remaining hosts for this loop 46400 1727204582.72541: getting the next task for host managed-node2 46400 1727204582.72546: done getting next task for host managed-node2 46400 1727204582.72549: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204582.72553: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204582.72571: getting variables 46400 1727204582.72572: in VariableManager get_vars() 46400 1727204582.72598: Calling all_inventory to load vars for managed-node2 46400 1727204582.72601: Calling groups_inventory to load vars for managed-node2 46400 1727204582.72604: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.72610: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.72612: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.72615: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.73979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.75657: done with get_vars() 46400 1727204582.75813: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.259) 0:01:13.044 ***** 46400 1727204582.76025: entering _queue_task() for managed-node2/setup 46400 1727204582.76888: worker is 1 (out of 1 available) 46400 1727204582.76902: exiting _queue_task() for managed-node2/setup 46400 1727204582.76916: done queuing things up, now waiting for results queue to drain 46400 1727204582.76918: waiting for pending results... 46400 1727204582.77630: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204582.77752: in run() - task 0affcd87-79f5-1303-fda8-000000001897 46400 1727204582.77767: variable 'ansible_search_path' from source: unknown 46400 1727204582.77772: variable 'ansible_search_path' from source: unknown 46400 1727204582.77804: calling self._execute() 46400 1727204582.77881: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.77885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.77897: variable 'omit' from source: magic vars 46400 1727204582.78193: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.78204: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.79003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204582.81481: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204582.81624: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204582.81723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204582.81753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204582.81780: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204582.81907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204582.81971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204582.82037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204582.82155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204582.82173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204582.82289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204582.82320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204582.82345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204582.82473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204582.82477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204582.82612: variable '__network_required_facts' from source: role '' defaults 46400 1727204582.82620: variable 'ansible_facts' from source: unknown 46400 1727204582.83252: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204582.83257: when evaluation is False, skipping this task 46400 1727204582.83281: _execute() done 46400 1727204582.83285: dumping result to json 46400 1727204582.83287: done dumping result, returning 46400 1727204582.83298: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000001897] 46400 1727204582.83301: sending task result for task 0affcd87-79f5-1303-fda8-000000001897 46400 1727204582.83476: done sending task result for task 0affcd87-79f5-1303-fda8-000000001897 46400 1727204582.83480: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204582.83561: no more pending results, returning what we have 46400 1727204582.83577: results queue empty 46400 1727204582.83578: checking for any_errors_fatal 46400 1727204582.83583: done checking for any_errors_fatal 46400 1727204582.83584: checking for max_fail_percentage 46400 1727204582.83614: done checking for max_fail_percentage 46400 1727204582.83616: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.83617: done checking to see if all hosts have failed 46400 1727204582.83618: getting the remaining hosts for this loop 46400 1727204582.83620: done getting the remaining hosts for 
this loop 46400 1727204582.83641: getting the next task for host managed-node2 46400 1727204582.83658: done getting next task for host managed-node2 46400 1727204582.83704: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204582.83712: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204582.83760: getting variables 46400 1727204582.83762: in VariableManager get_vars() 46400 1727204582.83803: Calling all_inventory to load vars for managed-node2 46400 1727204582.83805: Calling groups_inventory to load vars for managed-node2 46400 1727204582.83807: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.83817: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.83820: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.83828: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.86895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.88156: done with get_vars() 46400 1727204582.88184: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.122) 0:01:13.167 ***** 46400 1727204582.88263: entering _queue_task() for managed-node2/stat 46400 1727204582.88521: worker is 1 (out of 1 available) 46400 1727204582.88538: exiting _queue_task() for managed-node2/stat 46400 1727204582.88552: done queuing things up, now waiting for results queue to drain 46400 1727204582.88554: waiting for pending results... 
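
The conditional logged above for the skipped fact-gathering task, (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0), amounts to a set difference: the role only re-runs fact gathering when a fact name it needs is not already present in ansible_facts, and here the difference was empty, so the task was skipped. A minimal Python sketch of that check, assuming placeholder fact names rather than the role's actual __network_required_facts default:

    # Hypothetical sketch, not the role's real code: mirrors the logged conditional
    #   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    required_facts = ["distribution", "distribution_major_version", "os_family"]  # placeholder list

    ansible_facts = {
        "distribution": "CentOS",
        "distribution_major_version": "9",
        "os_family": "RedHat",
        # ... many more gathered facts
    }

    # Ansible's difference() filter behaves like a set difference on the two lists.
    missing = set(required_facts) - set(ansible_facts.keys())

    if len(missing) > 0:
        print("would run the fact-gathering task for:", sorted(missing))
    else:
        # This is the branch taken in the log above: the difference is empty,
        # the when: condition evaluates False, and the task is skipped.
        print("all required facts present; task skipped")
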
46400 1727204582.88820: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204582.88936: in run() - task 0affcd87-79f5-1303-fda8-000000001899 46400 1727204582.88962: variable 'ansible_search_path' from source: unknown 46400 1727204582.88974: variable 'ansible_search_path' from source: unknown 46400 1727204582.88998: calling self._execute() 46400 1727204582.89125: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.89129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.89153: variable 'omit' from source: magic vars 46400 1727204582.89618: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.89623: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.89848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204582.90086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204582.90155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204582.90192: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204582.90243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204582.90440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204582.90470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204582.90494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204582.90516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204582.90651: variable '__network_is_ostree' from source: set_fact 46400 1727204582.90663: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204582.90670: when evaluation is False, skipping this task 46400 1727204582.90683: _execute() done 46400 1727204582.90686: dumping result to json 46400 1727204582.90689: done dumping result, returning 46400 1727204582.90691: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000001899] 46400 1727204582.90694: sending task result for task 0affcd87-79f5-1303-fda8-000000001899 46400 1727204582.90862: done sending task result for task 0affcd87-79f5-1303-fda8-000000001899 46400 1727204582.90869: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204582.91173: no more pending results, returning what we have 46400 1727204582.91188: results queue empty 46400 1727204582.91190: checking for any_errors_fatal 46400 1727204582.91198: done checking for any_errors_fatal 46400 1727204582.91199: checking for 
max_fail_percentage 46400 1727204582.91201: done checking for max_fail_percentage 46400 1727204582.91202: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.91203: done checking to see if all hosts have failed 46400 1727204582.91204: getting the remaining hosts for this loop 46400 1727204582.91207: done getting the remaining hosts for this loop 46400 1727204582.91211: getting the next task for host managed-node2 46400 1727204582.91222: done getting next task for host managed-node2 46400 1727204582.91229: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204582.91236: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204582.91257: getting variables 46400 1727204582.91263: in VariableManager get_vars() 46400 1727204582.91308: Calling all_inventory to load vars for managed-node2 46400 1727204582.91310: Calling groups_inventory to load vars for managed-node2 46400 1727204582.91312: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.91323: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.91329: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.91333: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.93727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204582.95024: done with get_vars() 46400 1727204582.95056: done getting variables 46400 1727204582.95106: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:02 -0400 (0:00:00.068) 0:01:13.235 ***** 46400 1727204582.95139: entering _queue_task() for managed-node2/set_fact 46400 1727204582.95423: worker is 1 (out of 1 available) 46400 1727204582.95437: exiting _queue_task() for managed-node2/set_fact 46400 1727204582.95450: done queuing things up, now waiting for results queue to drain 46400 1727204582.95451: waiting for pending results... 46400 1727204582.95726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204582.95943: in run() - task 0affcd87-79f5-1303-fda8-00000000189a 46400 1727204582.95982: variable 'ansible_search_path' from source: unknown 46400 1727204582.95992: variable 'ansible_search_path' from source: unknown 46400 1727204582.96035: calling self._execute() 46400 1727204582.96158: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204582.96167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204582.96188: variable 'omit' from source: magic vars 46400 1727204582.96575: variable 'ansible_distribution_major_version' from source: facts 46400 1727204582.96585: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204582.96912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204582.97256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204582.97313: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204582.97348: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204582.97395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204582.97571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204582.97592: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204582.97610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204582.97628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204582.97749: variable '__network_is_ostree' from source: set_fact 46400 1727204582.97770: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204582.97773: when evaluation is False, skipping this task 46400 1727204582.97776: _execute() done 46400 1727204582.97778: dumping result to json 46400 1727204582.97794: done dumping result, returning 46400 1727204582.97797: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-00000000189a] 46400 1727204582.97799: sending task result for task 0affcd87-79f5-1303-fda8-00000000189a 46400 1727204582.97902: done sending task result for task 0affcd87-79f5-1303-fda8-00000000189a 46400 1727204582.97905: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204582.98076: no more pending results, returning what we have 46400 1727204582.98104: results queue empty 46400 1727204582.98105: checking for any_errors_fatal 46400 1727204582.98110: done checking for any_errors_fatal 46400 1727204582.98111: checking for max_fail_percentage 46400 1727204582.98112: done checking for max_fail_percentage 46400 1727204582.98113: checking to see if all hosts have failed and the running result is not ok 46400 1727204582.98114: done checking to see if all hosts have failed 46400 1727204582.98115: getting the remaining hosts for this loop 46400 1727204582.98116: done getting the remaining hosts for this loop 46400 1727204582.98120: getting the next task for host managed-node2 46400 1727204582.98139: done getting next task for host managed-node2 46400 1727204582.98143: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204582.98148: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204582.98168: getting variables 46400 1727204582.98169: in VariableManager get_vars() 46400 1727204582.98222: Calling all_inventory to load vars for managed-node2 46400 1727204582.98224: Calling groups_inventory to load vars for managed-node2 46400 1727204582.98226: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204582.98234: Calling all_plugins_play to load vars for managed-node2 46400 1727204582.98236: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204582.98239: Calling groups_plugins_play to load vars for managed-node2 46400 1727204582.99418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204583.01153: done with get_vars() 46400 1727204583.01184: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:03 -0400 (0:00:00.062) 0:01:13.298 ***** 46400 1727204583.01381: entering _queue_task() for managed-node2/service_facts 46400 1727204583.01813: worker is 1 (out of 1 available) 46400 1727204583.01829: exiting _queue_task() for managed-node2/service_facts 46400 1727204583.01845: done queuing things up, now waiting for results queue to drain 46400 1727204583.01847: waiting for pending results... 
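
The service_facts task queued above is executed like any other module, as the log entries that follow show: the SSH connection plugin discovers the remote home directory (echo ~), creates a temporary directory under ~/.ansible/tmp, transfers the AnsiballZ payload over SFTP, marks it executable, runs it with the remote Python, and parses the JSON the module prints on stdout. A rough, local-only Python sketch of that sequence, under stated assumptions: mktemp stands in for Ansible's timestamped mkdir, a plain file write stands in for the SFTP transfer, and the payload written here is a trivial placeholder, not the real AnsiballZ_service_facts.py:

    # Illustrative local approximation of the command sequence visible in the log below.
    import json
    import shlex
    import subprocess

    def sh(cmd: str) -> str:
        """Run a command through /bin/sh -c, as _low_level_execute_command() does."""
        return subprocess.run(["/bin/sh", "-c", cmd],
                              capture_output=True, text=True, check=True).stdout.strip()

    home = sh("echo ~ && sleep 0")                                   # step 1: find the home dir
    tmpdir = sh(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && '
                f'mktemp -d "{home}/.ansible/tmp/ansible-tmp-XXXXXX" ) && sleep 0')

    payload = f"{tmpdir}/AnsiballZ_service_facts.py"                 # step 2: place the module
    with open(payload, "w") as f:                                    # (Ansible sends it over SFTP)
        f.write('import json; print(json.dumps({"ansible_facts": {"services": {}}}))\n')

    sh(f"chmod u+x {shlex.quote(tmpdir)} {shlex.quote(payload)} && sleep 0")   # step 3: chmod
    out = sh(f"/usr/bin/python3 {shlex.quote(payload)} && sleep 0")            # step 4: execute
    facts = json.loads(out)                                          # step 5: parse module JSON
    print(sorted(facts["ansible_facts"]["services"]))                # e.g. the services dict below
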
46400 1727204583.02122: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204583.02289: in run() - task 0affcd87-79f5-1303-fda8-00000000189c 46400 1727204583.02295: variable 'ansible_search_path' from source: unknown 46400 1727204583.02299: variable 'ansible_search_path' from source: unknown 46400 1727204583.02371: calling self._execute() 46400 1727204583.02544: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204583.02549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204583.02552: variable 'omit' from source: magic vars 46400 1727204583.02967: variable 'ansible_distribution_major_version' from source: facts 46400 1727204583.02984: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204583.02987: variable 'omit' from source: magic vars 46400 1727204583.03068: variable 'omit' from source: magic vars 46400 1727204583.03134: variable 'omit' from source: magic vars 46400 1727204583.03223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204583.03263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204583.03346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204583.03352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204583.03354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204583.03382: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204583.03386: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204583.03388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204583.03506: Set connection var ansible_shell_type to sh 46400 1727204583.03515: Set connection var ansible_shell_executable to /bin/sh 46400 1727204583.03520: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204583.03542: Set connection var ansible_connection to ssh 46400 1727204583.03546: Set connection var ansible_pipelining to False 46400 1727204583.03574: Set connection var ansible_timeout to 10 46400 1727204583.03576: variable 'ansible_shell_executable' from source: unknown 46400 1727204583.03579: variable 'ansible_connection' from source: unknown 46400 1727204583.03583: variable 'ansible_module_compression' from source: unknown 46400 1727204583.03585: variable 'ansible_shell_type' from source: unknown 46400 1727204583.03587: variable 'ansible_shell_executable' from source: unknown 46400 1727204583.03589: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204583.03593: variable 'ansible_pipelining' from source: unknown 46400 1727204583.03598: variable 'ansible_timeout' from source: unknown 46400 1727204583.03611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204583.03878: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204583.03882: variable 'omit' from source: magic vars 46400 
1727204583.03885: starting attempt loop 46400 1727204583.03889: running the handler 46400 1727204583.03915: _low_level_execute_command(): starting 46400 1727204583.03926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204583.05106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204583.05131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.05137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.05341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.05345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.05347: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204583.05350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.05352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204583.05354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204583.05356: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204583.05358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.05360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.05362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.05583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.06890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204583.07049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204583.07055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204583.07129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204583.08792: stdout chunk (state=3): >>>/root <<< 46400 1727204583.08897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204583.08973: stderr chunk (state=3): >>><<< 46400 1727204583.08977: stdout chunk (state=3): >>><<< 46400 1727204583.09004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204583.09017: _low_level_execute_command(): starting 46400 1727204583.09023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417 `" && echo ansible-tmp-1727204583.0900455-51543-252034515663417="` echo /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417 `" ) && sleep 0' 46400 1727204583.09724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204583.09730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.09746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.09766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.09813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.09818: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204583.09848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.09861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204583.09894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204583.09898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204583.09907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.09923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.09954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.09957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.09967: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204583.09980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.10381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204583.10384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204583.10386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204583.10548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204583.12341: stdout chunk (state=3): >>>ansible-tmp-1727204583.0900455-51543-252034515663417=/root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417 <<< 46400 1727204583.12468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204583.12579: stderr chunk (state=3): >>><<< 46400 1727204583.12583: stdout chunk (state=3): >>><<< 46400 1727204583.12602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204583.0900455-51543-252034515663417=/root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204583.12670: variable 'ansible_module_compression' from source: unknown 46400 1727204583.12718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204583.12755: variable 'ansible_facts' from source: unknown 46400 1727204583.12842: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/AnsiballZ_service_facts.py 46400 1727204583.13127: Sending initial data 46400 1727204583.13130: Sent initial data (162 bytes) 46400 1727204583.14505: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204583.14564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.14570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.14572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.14575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.14577: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204583.14579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.14605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204583.14671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204583.14756: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204583.14765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.14783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.14787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.14789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.14795: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204583.14798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.14815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204583.14818: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204583.14826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204583.14830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204583.16536: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204583.16602: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204583.16613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpi2a1f857 /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/AnsiballZ_service_facts.py <<< 46400 1727204583.16671: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204583.17964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204583.18063: stderr chunk (state=3): >>><<< 46400 1727204583.18068: stdout chunk (state=3): >>><<< 46400 1727204583.18094: done transferring module to remote 46400 1727204583.18104: _low_level_execute_command(): starting 46400 1727204583.18110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/ /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/AnsiballZ_service_facts.py && sleep 0' 46400 1727204583.19794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204583.19883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.19892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.19905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.20080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.20084: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204583.20103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.20106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204583.20114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204583.20121: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204583.20130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.20149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.20160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.20173: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204583.20180: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204583.20190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.20274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204583.20291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204583.20298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204583.20370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204583.22086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204583.22171: stderr chunk (state=3): >>><<< 46400 1727204583.22175: stdout chunk (state=3): >>><<< 46400 1727204583.22191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204583.22196: _low_level_execute_command(): starting 46400 1727204583.22202: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/AnsiballZ_service_facts.py && sleep 0' 46400 1727204583.23578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.23588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204583.23628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.23632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204583.23649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204583.23656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204583.23738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204583.23752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204583.23757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204583.23836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.52553: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 46400 1727204584.52583: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 46400 1727204584.52622: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 46400 1727204584.52626: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 46400 1727204584.52631: stdout chunk (state=3): >>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hiber<<< 46400 1727204584.52634: stdout chunk (state=3): >>>nate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204584.53912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204584.53917: stderr chunk (state=3): >>><<< 46400 1727204584.53919: stdout chunk (state=3): >>><<< 46400 1727204584.53954: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204584.54739: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204584.54748: _low_level_execute_command(): starting 46400 1727204584.54754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204583.0900455-51543-252034515663417/ > /dev/null 2>&1 && sleep 0' 46400 1727204584.56771: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.56776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.56824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.56828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.56865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.56873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.56888: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.57037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.57081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204584.57086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.57162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.59003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204584.59007: stderr chunk (state=3): >>><<< 46400 1727204584.59010: stdout chunk (state=3): >>><<< 46400 1727204584.59029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204584.59034: handler run complete 46400 1727204584.59222: variable 'ansible_facts' from source: unknown 46400 1727204584.59387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204584.60048: variable 'ansible_facts' from source: unknown 46400 1727204584.60291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204584.60700: attempt loop complete, returning result 46400 1727204584.60704: _execute() done 46400 1727204584.60709: dumping result to json 46400 1727204584.60767: done dumping result, returning 46400 1727204584.60892: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-00000000189c] 46400 1727204584.60897: sending task result for task 0affcd87-79f5-1303-fda8-00000000189c 46400 1727204584.62043: done sending task result for task 0affcd87-79f5-1303-fda8-00000000189c 46400 1727204584.62047: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204584.62142: no more pending results, returning what we have 46400 1727204584.62146: results queue empty 46400 1727204584.62147: checking for any_errors_fatal 46400 1727204584.62153: done checking for any_errors_fatal 46400 1727204584.62154: checking for max_fail_percentage 46400 1727204584.62156: done checking for max_fail_percentage 46400 1727204584.62157: checking to see if all hosts have failed and the running result is not ok 46400 1727204584.62158: done checking to see if all hosts have failed 46400 1727204584.62159: getting the remaining hosts for this loop 46400 1727204584.62165: done getting the remaining hosts for this loop 46400 1727204584.62170: getting the next task for host managed-node2 46400 1727204584.62177: done getting next task for host managed-node2 46400 1727204584.62182: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204584.62188: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204584.62202: getting variables 46400 1727204584.62204: in VariableManager get_vars() 46400 1727204584.62243: Calling all_inventory to load vars for managed-node2 46400 1727204584.62246: Calling groups_inventory to load vars for managed-node2 46400 1727204584.62248: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204584.62259: Calling all_plugins_play to load vars for managed-node2 46400 1727204584.62272: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204584.62275: Calling groups_plugins_play to load vars for managed-node2 46400 1727204584.65000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204584.68886: done with get_vars() 46400 1727204584.68920: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:04 -0400 (0:00:01.676) 0:01:14.974 ***** 46400 1727204584.69039: entering _queue_task() for managed-node2/package_facts 46400 1727204584.69532: worker is 1 (out of 1 available) 46400 1727204584.69544: exiting _queue_task() for managed-node2/package_facts 46400 1727204584.69563: done queuing things up, now waiting for results queue to drain 46400 1727204584.69566: waiting for pending results... 
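The run now queues the companion fact-gathering task from the same file (set_facts.yml:26), this time using the package_facts module, whose return (ansible_facts.packages, a map from package name to a list of installed versions) starts streaming in the stdout chunks further below. A hedged sketch of what such a pair of tasks in a role's set_facts.yml commonly looks like follows; the actual task bodies of fedora.linux_system_roles.network are not shown in this log, so treat the YAML as an assumed illustration only.

# Illustrative only -- assumed shape of the fact-gathering tasks, not the
# verbatim contents of roles/network/tasks/set_facts.yml.
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true   # matches the censored result seen earlier in this run

- name: Check which packages are installed
  ansible.builtin.package_facts:
  no_log: true   # assumption; the log has not yet shown this task's result

With package_facts gathered, later tasks can test for keys in ansible_facts.packages (for example, whether a package such as NetworkManager is installed) when deciding how to configure the node.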
46400 1727204584.69901: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204584.70075: in run() - task 0affcd87-79f5-1303-fda8-00000000189d 46400 1727204584.70097: variable 'ansible_search_path' from source: unknown 46400 1727204584.70105: variable 'ansible_search_path' from source: unknown 46400 1727204584.70156: calling self._execute() 46400 1727204584.70283: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204584.70295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204584.70311: variable 'omit' from source: magic vars 46400 1727204584.70774: variable 'ansible_distribution_major_version' from source: facts 46400 1727204584.70805: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204584.70818: variable 'omit' from source: magic vars 46400 1727204584.70916: variable 'omit' from source: magic vars 46400 1727204584.70959: variable 'omit' from source: magic vars 46400 1727204584.71024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204584.71071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204584.71099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204584.71134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204584.71148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204584.71187: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204584.71196: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204584.71203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204584.71315: Set connection var ansible_shell_type to sh 46400 1727204584.71339: Set connection var ansible_shell_executable to /bin/sh 46400 1727204584.71353: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204584.71367: Set connection var ansible_connection to ssh 46400 1727204584.71377: Set connection var ansible_pipelining to False 46400 1727204584.71386: Set connection var ansible_timeout to 10 46400 1727204584.71417: variable 'ansible_shell_executable' from source: unknown 46400 1727204584.71427: variable 'ansible_connection' from source: unknown 46400 1727204584.71439: variable 'ansible_module_compression' from source: unknown 46400 1727204584.71452: variable 'ansible_shell_type' from source: unknown 46400 1727204584.71467: variable 'ansible_shell_executable' from source: unknown 46400 1727204584.71476: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204584.71483: variable 'ansible_pipelining' from source: unknown 46400 1727204584.71489: variable 'ansible_timeout' from source: unknown 46400 1727204584.71495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204584.71738: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204584.71758: variable 'omit' from source: magic vars 46400 
1727204584.71777: starting attempt loop 46400 1727204584.71788: running the handler 46400 1727204584.71806: _low_level_execute_command(): starting 46400 1727204584.71819: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204584.72863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204584.72890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.72912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.72933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.72987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.73001: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204584.73016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.73038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204584.73050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204584.73066: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204584.73081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.73102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.73120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.73137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.73148: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204584.73168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.73250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.73277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204584.73292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.73367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.74950: stdout chunk (state=3): >>>/root <<< 46400 1727204584.75170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204584.75216: stdout chunk (state=3): >>><<< 46400 1727204584.75574: stderr chunk (state=3): >>><<< 46400 1727204584.76305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204584.76310: _low_level_execute_command(): starting 46400 1727204584.76313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794 `" && echo ansible-tmp-1727204584.7589405-51627-218587051540794="` echo /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794 `" ) && sleep 0' 46400 1727204584.76981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204584.76998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.77014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.77033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.77089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.77103: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204584.77118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.77137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204584.77150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204584.77161: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204584.77179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.77194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.77210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.77223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.77235: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204584.77251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.77330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.77354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204584.77375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.77456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.79297: stdout chunk (state=3): >>>ansible-tmp-1727204584.7589405-51627-218587051540794=/root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794 <<< 46400 1727204584.79474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204584.79478: stdout chunk (state=3): >>><<< 46400 1727204584.79481: stderr chunk (state=3): >>><<< 46400 1727204584.79497: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204584.7589405-51627-218587051540794=/root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204584.79540: variable 'ansible_module_compression' from source: unknown 46400 1727204584.79585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204584.79635: variable 'ansible_facts' from source: unknown 46400 1727204584.79768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/AnsiballZ_package_facts.py 46400 1727204584.79894: Sending initial data 46400 1727204584.79897: Sent initial data (162 bytes) 46400 1727204584.80986: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204584.80990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.80992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.81005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.81045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.81052: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204584.81066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.81077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204584.81090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204584.81095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204584.81097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.81107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.81118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.81125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.81132: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204584.81140: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.81213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.81232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204584.81245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.81308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.83009: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204584.83039: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204584.83081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpa8dkrgkd /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/AnsiballZ_package_facts.py <<< 46400 1727204584.83114: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204584.85152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204584.85414: stderr chunk (state=3): >>><<< 46400 1727204584.85428: stdout chunk (state=3): >>><<< 46400 1727204584.85481: done transferring module to remote 46400 1727204584.85530: _low_level_execute_command(): starting 46400 1727204584.85590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/ /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/AnsiballZ_package_facts.py && sleep 0' 46400 1727204584.86671: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204584.86686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.86731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.86785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.86820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.86868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.86872: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.86939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204584.88944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204584.88949: stdout chunk (state=3): >>><<< 46400 1727204584.88951: stderr chunk (state=3): >>><<< 46400 1727204584.88953: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204584.88959: _low_level_execute_command(): starting 46400 1727204584.88963: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/AnsiballZ_package_facts.py && sleep 0' 46400 1727204584.90091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204584.90111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.90144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.90175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.90241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.90253: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204584.90287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.90309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204584.90327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204584.90338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204584.90350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204584.90371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204584.90388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204584.90418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204584.90437: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204584.90453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204584.90594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204584.90643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204584.90686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204584.90796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204585.37280: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204585.37389: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": 
[{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": 
"libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204585.37471: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": 
"6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": 
"9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": 
[{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": 
"5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", 
"version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204585.37481: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204585.39089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204585.39118: stderr chunk (state=3): >>><<< 46400 1727204585.39121: stdout chunk (state=3): >>><<< 46400 1727204585.39377: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204585.45170: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204585.45200: _low_level_execute_command(): starting 46400 1727204585.45210: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204584.7589405-51627-218587051540794/ > /dev/null 2>&1 && sleep 0' 46400 1727204585.46020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204585.46037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204585.46054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204585.46081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204585.46131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204585.46145: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204585.46163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204585.46184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204585.46197: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204585.46208: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204585.46220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204585.46239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204585.46255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204585.46274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204585.46286: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204585.46300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204585.46384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204585.46409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204585.46426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204585.46502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204585.48385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204585.48448: stderr chunk (state=3): >>><<< 46400 1727204585.48452: stdout chunk (state=3): >>><<< 46400 1727204585.48579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204585.48584: handler run complete 46400 1727204585.49439: variable 'ansible_facts' from source: unknown 46400 1727204585.49904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.51292: variable 'ansible_facts' from source: unknown 46400 1727204585.51793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.52540: attempt loop complete, returning result 46400 1727204585.52558: _execute() done 46400 1727204585.52561: dumping result to json 46400 1727204585.52816: done dumping result, returning 46400 1727204585.52833: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-00000000189d] 46400 1727204585.52836: sending task result for task 0affcd87-79f5-1303-fda8-00000000189d ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204585.55170: done sending task result for task 0affcd87-79f5-1303-fda8-00000000189d 46400 1727204585.55174: WORKER PROCESS EXITING 46400 1727204585.55183: no more pending results, returning what we have 46400 1727204585.55185: results queue empty 46400 1727204585.55186: checking for any_errors_fatal 46400 1727204585.55190: done checking for any_errors_fatal 46400 1727204585.55191: checking for max_fail_percentage 46400 1727204585.55192: done checking for max_fail_percentage 46400 1727204585.55192: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.55193: done checking to see if all hosts have failed 46400 1727204585.55193: getting the remaining hosts for this loop 46400 1727204585.55194: done getting the remaining hosts for this loop 46400 1727204585.55197: getting the next task for host managed-node2 46400 1727204585.55203: done getting next task for host managed-node2 46400 1727204585.55206: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204585.55210: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204585.55218: getting variables 46400 1727204585.55219: in VariableManager get_vars() 46400 1727204585.55244: Calling all_inventory to load vars for managed-node2 46400 1727204585.55246: Calling groups_inventory to load vars for managed-node2 46400 1727204585.55252: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.55259: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.55261: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.55263: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.56018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.59205: done with get_vars() 46400 1727204585.59257: done getting variables 46400 1727204585.59326: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.903) 0:01:15.878 ***** 46400 1727204585.59380: entering _queue_task() for managed-node2/debug 46400 1727204585.59861: worker is 1 (out of 1 available) 46400 1727204585.59878: exiting _queue_task() for managed-node2/debug 46400 1727204585.59892: done queuing things up, now waiting for results queue to drain 46400 1727204585.59894: waiting for pending results... 
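The task announced just above (roles/network/tasks/main.yml:7) is handed to the single available worker before it executes. Its result further below prints "Using network provider: nm" from a network_provider fact set earlier via set_fact, so a plausible sketch of the debug task, not the verbatim role source, is:

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"   # network_provider comes from set_fact per the trace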
46400 1727204585.60106: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204585.60208: in run() - task 0affcd87-79f5-1303-fda8-00000000183b 46400 1727204585.60220: variable 'ansible_search_path' from source: unknown 46400 1727204585.60225: variable 'ansible_search_path' from source: unknown 46400 1727204585.60256: calling self._execute() 46400 1727204585.60335: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.60339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.60352: variable 'omit' from source: magic vars 46400 1727204585.60642: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.60652: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.60658: variable 'omit' from source: magic vars 46400 1727204585.60714: variable 'omit' from source: magic vars 46400 1727204585.60788: variable 'network_provider' from source: set_fact 46400 1727204585.60803: variable 'omit' from source: magic vars 46400 1727204585.60841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204585.60871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204585.60889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204585.60905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204585.60915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204585.60940: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204585.60943: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.60946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.61016: Set connection var ansible_shell_type to sh 46400 1727204585.61025: Set connection var ansible_shell_executable to /bin/sh 46400 1727204585.61031: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204585.61038: Set connection var ansible_connection to ssh 46400 1727204585.61041: Set connection var ansible_pipelining to False 46400 1727204585.61046: Set connection var ansible_timeout to 10 46400 1727204585.61069: variable 'ansible_shell_executable' from source: unknown 46400 1727204585.61073: variable 'ansible_connection' from source: unknown 46400 1727204585.61075: variable 'ansible_module_compression' from source: unknown 46400 1727204585.61078: variable 'ansible_shell_type' from source: unknown 46400 1727204585.61080: variable 'ansible_shell_executable' from source: unknown 46400 1727204585.61082: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.61085: variable 'ansible_pipelining' from source: unknown 46400 1727204585.61087: variable 'ansible_timeout' from source: unknown 46400 1727204585.61091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.61200: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204585.61210: variable 'omit' from source: magic vars 46400 1727204585.61213: starting attempt loop 46400 1727204585.61216: running the handler 46400 1727204585.61256: handler run complete 46400 1727204585.61269: attempt loop complete, returning result 46400 1727204585.61272: _execute() done 46400 1727204585.61274: dumping result to json 46400 1727204585.61277: done dumping result, returning 46400 1727204585.61283: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-00000000183b] 46400 1727204585.61288: sending task result for task 0affcd87-79f5-1303-fda8-00000000183b 46400 1727204585.61383: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183b 46400 1727204585.61386: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204585.61453: no more pending results, returning what we have 46400 1727204585.61457: results queue empty 46400 1727204585.61458: checking for any_errors_fatal 46400 1727204585.61677: done checking for any_errors_fatal 46400 1727204585.61679: checking for max_fail_percentage 46400 1727204585.61681: done checking for max_fail_percentage 46400 1727204585.61682: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.61682: done checking to see if all hosts have failed 46400 1727204585.61683: getting the remaining hosts for this loop 46400 1727204585.61686: done getting the remaining hosts for this loop 46400 1727204585.61691: getting the next task for host managed-node2 46400 1727204585.61698: done getting next task for host managed-node2 46400 1727204585.61702: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204585.61708: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204585.61720: getting variables 46400 1727204585.61721: in VariableManager get_vars() 46400 1727204585.61770: Calling all_inventory to load vars for managed-node2 46400 1727204585.61773: Calling groups_inventory to load vars for managed-node2 46400 1727204585.61776: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.61786: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.61788: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.61791: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.63896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.66124: done with get_vars() 46400 1727204585.66163: done getting variables 46400 1727204585.66239: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.069) 0:01:15.947 ***** 46400 1727204585.66293: entering _queue_task() for managed-node2/fail 46400 1727204585.66687: worker is 1 (out of 1 available) 46400 1727204585.66701: exiting _queue_task() for managed-node2/fail 46400 1727204585.66719: done queuing things up, now waiting for results queue to drain 46400 1727204585.66721: waiting for pending results... 
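The fail task queued above (main.yml:11) never fires on this host: the entries that follow evaluate network_state != {} as False and skip it. Only that single condition is visible in the log, so the sketch below is a guess at the shape of the guard rather than the role's actual task; the message text is hypothetical.

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      fail:
        msg: The `network_state` variable is not supported with the initscripts provider   # hypothetical wording
      when: network_state != {}   # the condition logged as false_condition; the real task may combine more checks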
46400 1727204585.67050: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204585.67221: in run() - task 0affcd87-79f5-1303-fda8-00000000183c 46400 1727204585.67246: variable 'ansible_search_path' from source: unknown 46400 1727204585.67256: variable 'ansible_search_path' from source: unknown 46400 1727204585.67308: calling self._execute() 46400 1727204585.67419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.67430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.67444: variable 'omit' from source: magic vars 46400 1727204585.67881: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.67899: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.68041: variable 'network_state' from source: role '' defaults 46400 1727204585.68065: Evaluated conditional (network_state != {}): False 46400 1727204585.68074: when evaluation is False, skipping this task 46400 1727204585.68082: _execute() done 46400 1727204585.68089: dumping result to json 46400 1727204585.68096: done dumping result, returning 46400 1727204585.68106: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-00000000183c] 46400 1727204585.68116: sending task result for task 0affcd87-79f5-1303-fda8-00000000183c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204585.68287: no more pending results, returning what we have 46400 1727204585.68292: results queue empty 46400 1727204585.68293: checking for any_errors_fatal 46400 1727204585.68302: done checking for any_errors_fatal 46400 1727204585.68302: checking for max_fail_percentage 46400 1727204585.68305: done checking for max_fail_percentage 46400 1727204585.68306: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.68307: done checking to see if all hosts have failed 46400 1727204585.68308: getting the remaining hosts for this loop 46400 1727204585.68311: done getting the remaining hosts for this loop 46400 1727204585.68316: getting the next task for host managed-node2 46400 1727204585.68329: done getting next task for host managed-node2 46400 1727204585.68334: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204585.68341: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204585.68371: getting variables 46400 1727204585.68373: in VariableManager get_vars() 46400 1727204585.68421: Calling all_inventory to load vars for managed-node2 46400 1727204585.68425: Calling groups_inventory to load vars for managed-node2 46400 1727204585.68428: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.68573: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.68577: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.68582: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.69515: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183c 46400 1727204585.69519: WORKER PROCESS EXITING 46400 1727204585.70498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.71937: done with get_vars() 46400 1727204585.71965: done getting variables 46400 1727204585.72029: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.057) 0:01:16.005 ***** 46400 1727204585.72072: entering _queue_task() for managed-node2/fail 46400 1727204585.72427: worker is 1 (out of 1 available) 46400 1727204585.72442: exiting _queue_task() for managed-node2/fail 46400 1727204585.72460: done queuing things up, now waiting for results queue to drain 46400 1727204585.72463: waiting for pending results... 
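The next guard (main.yml:18) follows the same pattern: the trace below skips it because network_state != {} is again False. Its name implies an additional major-version check that the run never reaches, so the second condition in this sketch is inferred from the task name only.

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Applying the network state configuration requires EL8 or later   # hypothetical wording
      when:
        - network_state != {}                               # logged as the false condition below
        - ansible_distribution_major_version | int < 8      # implied by the task name, not shown in the log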
46400 1727204585.72732: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204585.72831: in run() - task 0affcd87-79f5-1303-fda8-00000000183d 46400 1727204585.72844: variable 'ansible_search_path' from source: unknown 46400 1727204585.72848: variable 'ansible_search_path' from source: unknown 46400 1727204585.72881: calling self._execute() 46400 1727204585.72953: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.72958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.72974: variable 'omit' from source: magic vars 46400 1727204585.73250: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.73262: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.73351: variable 'network_state' from source: role '' defaults 46400 1727204585.73362: Evaluated conditional (network_state != {}): False 46400 1727204585.73369: when evaluation is False, skipping this task 46400 1727204585.73372: _execute() done 46400 1727204585.73374: dumping result to json 46400 1727204585.73377: done dumping result, returning 46400 1727204585.73380: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-00000000183d] 46400 1727204585.73384: sending task result for task 0affcd87-79f5-1303-fda8-00000000183d 46400 1727204585.73486: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183d 46400 1727204585.73488: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204585.73541: no more pending results, returning what we have 46400 1727204585.73546: results queue empty 46400 1727204585.73547: checking for any_errors_fatal 46400 1727204585.73555: done checking for any_errors_fatal 46400 1727204585.73556: checking for max_fail_percentage 46400 1727204585.73558: done checking for max_fail_percentage 46400 1727204585.73559: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.73559: done checking to see if all hosts have failed 46400 1727204585.73563: getting the remaining hosts for this loop 46400 1727204585.73566: done getting the remaining hosts for this loop 46400 1727204585.73570: getting the next task for host managed-node2 46400 1727204585.73578: done getting next task for host managed-node2 46400 1727204585.73583: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204585.73588: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204585.73608: getting variables 46400 1727204585.73610: in VariableManager get_vars() 46400 1727204585.73656: Calling all_inventory to load vars for managed-node2 46400 1727204585.73659: Calling groups_inventory to load vars for managed-node2 46400 1727204585.73665: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.73674: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.73677: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.73679: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.74657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.76323: done with get_vars() 46400 1727204585.76361: done getting variables 46400 1727204585.76424: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.043) 0:01:16.049 ***** 46400 1727204585.76467: entering _queue_task() for managed-node2/fail 46400 1727204585.76772: worker is 1 (out of 1 available) 46400 1727204585.76787: exiting _queue_task() for managed-node2/fail 46400 1727204585.76801: done queuing things up, now waiting for results queue to drain 46400 1727204585.76803: waiting for pending results... 
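The teaming guard announced above (main.yml:25) is evaluated in the entries that follow: ansible_distribution_major_version | int > 9 comes out False on this EL9 host, so the task is skipped. A sketch of that guard, again with a hypothetical message and only the condition the log exposes:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: Team interfaces are not supported on EL10 or later   # hypothetical wording
      when: ansible_distribution_major_version | int > 9   # matches the logged false_condition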
46400 1727204585.77009: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204585.77115: in run() - task 0affcd87-79f5-1303-fda8-00000000183e 46400 1727204585.77125: variable 'ansible_search_path' from source: unknown 46400 1727204585.77129: variable 'ansible_search_path' from source: unknown 46400 1727204585.77159: calling self._execute() 46400 1727204585.77243: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.77248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.77255: variable 'omit' from source: magic vars 46400 1727204585.77547: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.77556: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.77694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204585.80168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204585.80250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204585.80307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204585.80348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204585.80381: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204585.80481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.80526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.80559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.80610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.80640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.80770: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.80793: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204585.80807: when evaluation is False, skipping this task 46400 1727204585.80811: _execute() done 46400 1727204585.80819: dumping result to json 46400 1727204585.80822: done dumping result, returning 46400 1727204585.80833: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-00000000183e] 46400 1727204585.80859: sending task result for task 
0affcd87-79f5-1303-fda8-00000000183e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204585.81005: no more pending results, returning what we have 46400 1727204585.81009: results queue empty 46400 1727204585.81010: checking for any_errors_fatal 46400 1727204585.81018: done checking for any_errors_fatal 46400 1727204585.81019: checking for max_fail_percentage 46400 1727204585.81020: done checking for max_fail_percentage 46400 1727204585.81021: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.81022: done checking to see if all hosts have failed 46400 1727204585.81023: getting the remaining hosts for this loop 46400 1727204585.81025: done getting the remaining hosts for this loop 46400 1727204585.81029: getting the next task for host managed-node2 46400 1727204585.81037: done getting next task for host managed-node2 46400 1727204585.81042: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204585.81046: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204585.81067: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183e 46400 1727204585.81073: WORKER PROCESS EXITING 46400 1727204585.81087: getting variables 46400 1727204585.81089: in VariableManager get_vars() 46400 1727204585.81135: Calling all_inventory to load vars for managed-node2 46400 1727204585.81138: Calling groups_inventory to load vars for managed-node2 46400 1727204585.81141: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.81151: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.81153: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.81156: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.82219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.83613: done with get_vars() 46400 1727204585.83647: done getting variables 46400 1727204585.83715: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.072) 0:01:16.122 ***** 46400 1727204585.83749: entering _queue_task() for managed-node2/dnf 46400 1727204585.84067: worker is 1 (out of 1 available) 46400 1727204585.84081: exiting _queue_task() for managed-node2/dnf 46400 1727204585.84094: done queuing things up, now waiting for results queue to drain 46400 1727204585.84095: waiting for pending results... 
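The DNF check queued here is guarded by __network_wireless_connections_defined or __network_team_connections_defined, and the lines that follow show it being skipped because the play only defines a plain interface profile. For contrast, a hypothetical network_connections entry that would satisfy the team half of that guard (illustrative values, not from this play) might be:

    network_connections:
      - name: team0            # hypothetical team profile
        type: team
        state: up

With only ethernet-style profiles in network_connections, as in this run, both flags stay false and every wireless/team-related task in the role is skipped.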
46400 1727204585.84293: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204585.84397: in run() - task 0affcd87-79f5-1303-fda8-00000000183f 46400 1727204585.84407: variable 'ansible_search_path' from source: unknown 46400 1727204585.84410: variable 'ansible_search_path' from source: unknown 46400 1727204585.84446: calling self._execute() 46400 1727204585.84519: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.84523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.84531: variable 'omit' from source: magic vars 46400 1727204585.84823: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.84834: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.84984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204585.87339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204585.87420: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204585.87465: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204585.87512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204585.87550: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204585.87670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.87694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.87730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.87780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.87805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.88001: variable 'ansible_distribution' from source: facts 46400 1727204585.88006: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.88009: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204585.88159: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204585.88305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.88324: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.88344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.88394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.88411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.88463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.88509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.88528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.88575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.88596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.88654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.88684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.88717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.88771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.88794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.88999: variable 'network_connections' from source: include params 46400 1727204585.89007: variable 'interface' from source: play vars 46400 1727204585.89086: variable 'interface' from source: play vars 46400 1727204585.89190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204585.89353: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204585.89386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204585.89408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204585.89461: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204585.89509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204585.89533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204585.89558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.89591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204585.89768: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204585.90097: variable 'network_connections' from source: include params 46400 1727204585.90112: variable 'interface' from source: play vars 46400 1727204585.90196: variable 'interface' from source: play vars 46400 1727204585.90243: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204585.90255: when evaluation is False, skipping this task 46400 1727204585.90267: _execute() done 46400 1727204585.90275: dumping result to json 46400 1727204585.90281: done dumping result, returning 46400 1727204585.90292: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000183f] 46400 1727204585.90301: sending task result for task 0affcd87-79f5-1303-fda8-00000000183f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204585.90474: no more pending results, returning what we have 46400 1727204585.90480: results queue empty 46400 1727204585.90481: checking for any_errors_fatal 46400 1727204585.90489: done checking for any_errors_fatal 46400 1727204585.90490: checking for max_fail_percentage 46400 1727204585.90492: done checking for max_fail_percentage 46400 1727204585.90493: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.90494: done checking to see if all hosts have failed 46400 1727204585.90495: getting the remaining hosts for this loop 46400 1727204585.90497: done getting the remaining hosts for this loop 46400 1727204585.90501: getting the next task for host managed-node2 46400 1727204585.90511: done getting next task for host managed-node2 46400 1727204585.90516: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204585.90521: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204585.90544: getting variables 46400 1727204585.90546: in VariableManager get_vars() 46400 1727204585.90600: Calling all_inventory to load vars for managed-node2 46400 1727204585.90603: Calling groups_inventory to load vars for managed-node2 46400 1727204585.90606: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204585.90617: Calling all_plugins_play to load vars for managed-node2 46400 1727204585.90620: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204585.90623: Calling groups_plugins_play to load vars for managed-node2 46400 1727204585.91314: done sending task result for task 0affcd87-79f5-1303-fda8-00000000183f 46400 1727204585.91317: WORKER PROCESS EXITING 46400 1727204585.91771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204585.93102: done with get_vars() 46400 1727204585.93150: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204585.93233: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:05 -0400 (0:00:00.095) 0:01:16.217 ***** 46400 1727204585.93292: entering _queue_task() for managed-node2/yum 46400 1727204585.94169: worker is 1 (out of 1 available) 46400 1727204585.94184: exiting _queue_task() for managed-node2/yum 46400 1727204585.94198: done queuing things up, now waiting for results queue to drain 46400 1727204585.94200: waiting for pending results... 
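The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above is core module redirection at work: on hosts whose package manager is dnf, a task written against the yum module name is served by the dnf action plugin. A generic, hypothetical task illustrating the redirect (not the role's task body at tasks/main.yml:48):

    - name: Update a package through the yum module name   # illustrative only
      ansible.builtin.yum:
        name: NetworkManager
        state: latest
      # On dnf-based hosts like this one the call resolves to the dnf action plugin, as the log shows.

The lines that follow then skip the task anyway, since its guard (ansible_distribution_major_version | int < 8) is false on this host.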
46400 1727204585.94827: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204585.95028: in run() - task 0affcd87-79f5-1303-fda8-000000001840 46400 1727204585.95048: variable 'ansible_search_path' from source: unknown 46400 1727204585.95063: variable 'ansible_search_path' from source: unknown 46400 1727204585.95128: calling self._execute() 46400 1727204585.95261: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204585.95279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204585.95298: variable 'omit' from source: magic vars 46400 1727204585.96010: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.96034: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204585.96306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204585.99186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204585.99231: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204585.99277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204585.99311: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204585.99332: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204585.99418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204585.99438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204585.99457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204585.99519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204585.99524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204585.99626: variable 'ansible_distribution_major_version' from source: facts 46400 1727204585.99644: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204585.99647: when evaluation is False, skipping this task 46400 1727204585.99650: _execute() done 46400 1727204585.99652: dumping result to json 46400 1727204585.99654: done dumping result, returning 46400 1727204585.99661: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001840] 46400 
1727204585.99676: sending task result for task 0affcd87-79f5-1303-fda8-000000001840 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204585.99855: no more pending results, returning what we have 46400 1727204585.99860: results queue empty 46400 1727204585.99861: checking for any_errors_fatal 46400 1727204585.99871: done checking for any_errors_fatal 46400 1727204585.99872: checking for max_fail_percentage 46400 1727204585.99876: done checking for max_fail_percentage 46400 1727204585.99877: checking to see if all hosts have failed and the running result is not ok 46400 1727204585.99878: done checking to see if all hosts have failed 46400 1727204585.99879: getting the remaining hosts for this loop 46400 1727204585.99881: done getting the remaining hosts for this loop 46400 1727204585.99885: getting the next task for host managed-node2 46400 1727204585.99895: done getting next task for host managed-node2 46400 1727204585.99900: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204585.99905: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204585.99933: getting variables 46400 1727204585.99935: in VariableManager get_vars() 46400 1727204585.99984: Calling all_inventory to load vars for managed-node2 46400 1727204585.99987: Calling groups_inventory to load vars for managed-node2 46400 1727204585.99990: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.00001: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.00003: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.00006: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.01386: done sending task result for task 0affcd87-79f5-1303-fda8-000000001840 46400 1727204586.01394: WORKER PROCESS EXITING 46400 1727204586.01405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.02342: done with get_vars() 46400 1727204586.02370: done getting variables 46400 1727204586.02418: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.091) 0:01:16.309 ***** 46400 1727204586.02445: entering _queue_task() for managed-node2/fail 46400 1727204586.02700: worker is 1 (out of 1 available) 46400 1727204586.02715: exiting _queue_task() for managed-node2/fail 46400 1727204586.02729: done queuing things up, now waiting for results queue to drain 46400 1727204586.02731: waiting for pending results... 
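Throughout these traces the executor resolves 'network_connections' from include params and 'interface' from play vars, which suggests the play hands the role a connection list templated from an interface variable. A hypothetical invocation of that shape (names and values are illustrative, not taken from this play) could be:

    - name: Apply the test profile                          # hypothetical wrapper task
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ interface }}"                         # interface supplied as a play var
            type: ethernet
            state: up

The consent check queued above is another fail task guarded by the same wireless/team condition, and the following lines skip it for the same reason as the DNF check.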
46400 1727204586.02939: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204586.03044: in run() - task 0affcd87-79f5-1303-fda8-000000001841 46400 1727204586.03055: variable 'ansible_search_path' from source: unknown 46400 1727204586.03060: variable 'ansible_search_path' from source: unknown 46400 1727204586.03093: calling self._execute() 46400 1727204586.03172: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.03177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.03186: variable 'omit' from source: magic vars 46400 1727204586.03470: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.03480: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.03561: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204586.03843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204586.06103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204586.06150: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204586.06184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204586.06209: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204586.06229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204586.06297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.06317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.06335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.06366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.06382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.06414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.06430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.06448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.06482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.06493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.06523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.06538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.06554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.06586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.06598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.06719: variable 'network_connections' from source: include params 46400 1727204586.06730: variable 'interface' from source: play vars 46400 1727204586.06790: variable 'interface' from source: play vars 46400 1727204586.06842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204586.06976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204586.07006: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204586.07030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204586.07052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204586.07087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204586.07102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204586.07122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.07142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204586.07193: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204586.07366: variable 'network_connections' 
from source: include params 46400 1727204586.07372: variable 'interface' from source: play vars 46400 1727204586.07418: variable 'interface' from source: play vars 46400 1727204586.07443: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204586.07447: when evaluation is False, skipping this task 46400 1727204586.07451: _execute() done 46400 1727204586.07453: dumping result to json 46400 1727204586.07455: done dumping result, returning 46400 1727204586.07467: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001841] 46400 1727204586.07472: sending task result for task 0affcd87-79f5-1303-fda8-000000001841 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204586.07625: no more pending results, returning what we have 46400 1727204586.07629: results queue empty 46400 1727204586.07630: checking for any_errors_fatal 46400 1727204586.07635: done checking for any_errors_fatal 46400 1727204586.07636: checking for max_fail_percentage 46400 1727204586.07638: done checking for max_fail_percentage 46400 1727204586.07640: checking to see if all hosts have failed and the running result is not ok 46400 1727204586.07641: done checking to see if all hosts have failed 46400 1727204586.07641: getting the remaining hosts for this loop 46400 1727204586.07643: done getting the remaining hosts for this loop 46400 1727204586.07647: getting the next task for host managed-node2 46400 1727204586.07657: done getting next task for host managed-node2 46400 1727204586.07662: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204586.07667: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204586.07690: getting variables 46400 1727204586.07692: in VariableManager get_vars() 46400 1727204586.07734: Calling all_inventory to load vars for managed-node2 46400 1727204586.07737: Calling groups_inventory to load vars for managed-node2 46400 1727204586.07739: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.07749: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.07751: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.07753: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.08796: done sending task result for task 0affcd87-79f5-1303-fda8-000000001841 46400 1727204586.08800: WORKER PROCESS EXITING 46400 1727204586.09236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.10794: done with get_vars() 46400 1727204586.10813: done getting variables 46400 1727204586.10866: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.084) 0:01:16.393 ***** 46400 1727204586.10896: entering _queue_task() for managed-node2/package 46400 1727204586.11146: worker is 1 (out of 1 available) 46400 1727204586.11160: exiting _queue_task() for managed-node2/package 46400 1727204586.11175: done queuing things up, now waiting for results queue to drain 46400 1727204586.11176: waiting for pending results... 
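The Install packages task queued here is guarded, as the next lines show, by not network_packages is subset(ansible_facts.packages.keys()), i.e. the subset test from the mathstuff test plugin loaded earlier: installation only runs when at least one required package is missing from the gathered package facts. A minimal, hypothetical task using the same pattern (the when string matches the log; the body is illustrative, not the role source at tasks/main.yml:73):

    - name: Install packages                                # illustrative sketch
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Here all of network_packages are already present on managed-node2, so the condition is false and the task is skipped.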
46400 1727204586.11388: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204586.11500: in run() - task 0affcd87-79f5-1303-fda8-000000001842 46400 1727204586.11515: variable 'ansible_search_path' from source: unknown 46400 1727204586.11519: variable 'ansible_search_path' from source: unknown 46400 1727204586.11547: calling self._execute() 46400 1727204586.11624: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.11636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.11640: variable 'omit' from source: magic vars 46400 1727204586.12070: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.12074: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.12169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204586.12439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204586.12485: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204586.12518: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204586.12550: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204586.12675: variable 'network_packages' from source: role '' defaults 46400 1727204586.12783: variable '__network_provider_setup' from source: role '' defaults 46400 1727204586.12795: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204586.12860: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204586.12873: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204586.12989: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204586.13167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204586.14681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204586.14735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204586.14768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204586.14793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204586.14814: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204586.14879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.14899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.14917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.14943: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.14960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.14993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.15009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.15025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.15051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.15064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.15220: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204586.15306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.15322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.15339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.15367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.15380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.15445: variable 'ansible_python' from source: facts 46400 1727204586.15459: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204586.15524: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204586.15585: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204586.15675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.15692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204586.15711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.15740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.15750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.15787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.15808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.15825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.15854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.15867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.15963: variable 'network_connections' from source: include params 46400 1727204586.15973: variable 'interface' from source: play vars 46400 1727204586.16043: variable 'interface' from source: play vars 46400 1727204586.16102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204586.16122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204586.16143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.16173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204586.16210: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204586.16396: variable 'network_connections' from source: include params 46400 1727204586.16399: variable 'interface' from source: play vars 46400 1727204586.16475: variable 'interface' from source: play vars 46400 1727204586.16516: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204586.16573: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204586.16779: variable 'network_connections' from source: include params 46400 
1727204586.16785: variable 'interface' from source: play vars 46400 1727204586.16832: variable 'interface' from source: play vars 46400 1727204586.16851: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204586.16909: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204586.17116: variable 'network_connections' from source: include params 46400 1727204586.17119: variable 'interface' from source: play vars 46400 1727204586.17168: variable 'interface' from source: play vars 46400 1727204586.17212: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204586.17260: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204586.17265: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204586.17308: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204586.17462: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204586.17779: variable 'network_connections' from source: include params 46400 1727204586.17787: variable 'interface' from source: play vars 46400 1727204586.17829: variable 'interface' from source: play vars 46400 1727204586.17836: variable 'ansible_distribution' from source: facts 46400 1727204586.17839: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.17844: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.17869: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204586.17981: variable 'ansible_distribution' from source: facts 46400 1727204586.17984: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.17989: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.17997: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204586.18112: variable 'ansible_distribution' from source: facts 46400 1727204586.18115: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.18124: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.18143: variable 'network_provider' from source: set_fact 46400 1727204586.18154: variable 'ansible_facts' from source: unknown 46400 1727204586.18618: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204586.18621: when evaluation is False, skipping this task 46400 1727204586.18628: _execute() done 46400 1727204586.18631: dumping result to json 46400 1727204586.18633: done dumping result, returning 46400 1727204586.18636: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000001842] 46400 1727204586.18641: sending task result for task 0affcd87-79f5-1303-fda8-000000001842 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204586.18797: no more pending results, returning what we have 46400 1727204586.18801: results queue empty 46400 1727204586.18802: checking for any_errors_fatal 46400 1727204586.18810: done checking for any_errors_fatal 46400 1727204586.18811: checking for max_fail_percentage 46400 1727204586.18812: done checking for max_fail_percentage 46400 
1727204586.18813: checking to see if all hosts have failed and the running result is not ok 46400 1727204586.18814: done checking to see if all hosts have failed 46400 1727204586.18815: getting the remaining hosts for this loop 46400 1727204586.18816: done getting the remaining hosts for this loop 46400 1727204586.18820: getting the next task for host managed-node2 46400 1727204586.18829: done getting next task for host managed-node2 46400 1727204586.18833: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204586.18837: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204586.18861: getting variables 46400 1727204586.18863: in VariableManager get_vars() 46400 1727204586.18907: Calling all_inventory to load vars for managed-node2 46400 1727204586.18910: Calling groups_inventory to load vars for managed-node2 46400 1727204586.18917: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.18927: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.18930: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.18932: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.19546: done sending task result for task 0affcd87-79f5-1303-fda8-000000001842 46400 1727204586.19549: WORKER PROCESS EXITING 46400 1727204586.19872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.21238: done with get_vars() 46400 1727204586.21271: done getting variables 46400 1727204586.21320: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.104) 0:01:16.498 ***** 46400 1727204586.21348: entering _queue_task() for managed-node2/package 46400 1727204586.21748: worker is 1 (out of 1 available) 46400 1727204586.21763: exiting _queue_task() for managed-node2/package 46400 
1727204586.21778: done queuing things up, now waiting for results queue to drain 46400 1727204586.21780: waiting for pending results... 46400 1727204586.22107: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204586.22246: in run() - task 0affcd87-79f5-1303-fda8-000000001843 46400 1727204586.22250: variable 'ansible_search_path' from source: unknown 46400 1727204586.22253: variable 'ansible_search_path' from source: unknown 46400 1727204586.22311: calling self._execute() 46400 1727204586.22389: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.22393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.22403: variable 'omit' from source: magic vars 46400 1727204586.22795: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.22806: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.22930: variable 'network_state' from source: role '' defaults 46400 1727204586.22940: Evaluated conditional (network_state != {}): False 46400 1727204586.22943: when evaluation is False, skipping this task 46400 1727204586.22947: _execute() done 46400 1727204586.22950: dumping result to json 46400 1727204586.22953: done dumping result, returning 46400 1727204586.22958: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001843] 46400 1727204586.22970: sending task result for task 0affcd87-79f5-1303-fda8-000000001843 46400 1727204586.23082: done sending task result for task 0affcd87-79f5-1303-fda8-000000001843 46400 1727204586.23085: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204586.23139: no more pending results, returning what we have 46400 1727204586.23144: results queue empty 46400 1727204586.23146: checking for any_errors_fatal 46400 1727204586.23153: done checking for any_errors_fatal 46400 1727204586.23153: checking for max_fail_percentage 46400 1727204586.23156: done checking for max_fail_percentage 46400 1727204586.23157: checking to see if all hosts have failed and the running result is not ok 46400 1727204586.23158: done checking to see if all hosts have failed 46400 1727204586.23158: getting the remaining hosts for this loop 46400 1727204586.23163: done getting the remaining hosts for this loop 46400 1727204586.23169: getting the next task for host managed-node2 46400 1727204586.23179: done getting next task for host managed-node2 46400 1727204586.23184: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204586.23189: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204586.23216: getting variables 46400 1727204586.23218: in VariableManager get_vars() 46400 1727204586.23269: Calling all_inventory to load vars for managed-node2 46400 1727204586.23272: Calling groups_inventory to load vars for managed-node2 46400 1727204586.23275: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.23288: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.23292: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.23295: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.24508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.32382: done with get_vars() 46400 1727204586.32412: done getting variables 46400 1727204586.32603: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.112) 0:01:16.610 ***** 46400 1727204586.32640: entering _queue_task() for managed-node2/package 46400 1727204586.33021: worker is 1 (out of 1 available) 46400 1727204586.33034: exiting _queue_task() for managed-node2/package 46400 1727204586.33049: done queuing things up, now waiting for results queue to drain 46400 1727204586.33051: waiting for pending results... 
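
Both "Install NetworkManager and nmstate when using network_state variable" (main.yml:85, queued above) and "Install python3-libnmstate when using network_state variable" (main.yml:96, queued here) use the package action and are gated on the two conditionals the log evaluates: ansible_distribution_major_version != '6' (True) and network_state != {} (False), so both end up skipped. A minimal sketch of what such tasks might look like, assuming only the module and the logged when conditions; the package names and the list layout are illustrative assumptions, and the distribution-version check may be inherited from an enclosing block rather than declared on the task itself:

    # Sketch only: module and when conditions taken from the log; package names assumed.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # assumed package name
          - nmstate          # assumed package name
        state: present
      when: network_state != {}   # version check possibly inherited from a parent block

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate   # assumed package name
        state: present
      when: network_state != {}
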
46400 1727204586.33399: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204586.33576: in run() - task 0affcd87-79f5-1303-fda8-000000001844 46400 1727204586.33588: variable 'ansible_search_path' from source: unknown 46400 1727204586.33591: variable 'ansible_search_path' from source: unknown 46400 1727204586.33635: calling self._execute() 46400 1727204586.33744: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.33750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.33759: variable 'omit' from source: magic vars 46400 1727204586.34137: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.34148: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.34278: variable 'network_state' from source: role '' defaults 46400 1727204586.34294: Evaluated conditional (network_state != {}): False 46400 1727204586.34297: when evaluation is False, skipping this task 46400 1727204586.34300: _execute() done 46400 1727204586.34303: dumping result to json 46400 1727204586.34305: done dumping result, returning 46400 1727204586.34311: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001844] 46400 1727204586.34318: sending task result for task 0affcd87-79f5-1303-fda8-000000001844 46400 1727204586.34434: done sending task result for task 0affcd87-79f5-1303-fda8-000000001844 46400 1727204586.34437: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204586.34490: no more pending results, returning what we have 46400 1727204586.34494: results queue empty 46400 1727204586.34496: checking for any_errors_fatal 46400 1727204586.34504: done checking for any_errors_fatal 46400 1727204586.34505: checking for max_fail_percentage 46400 1727204586.34507: done checking for max_fail_percentage 46400 1727204586.34508: checking to see if all hosts have failed and the running result is not ok 46400 1727204586.34509: done checking to see if all hosts have failed 46400 1727204586.34510: getting the remaining hosts for this loop 46400 1727204586.34512: done getting the remaining hosts for this loop 46400 1727204586.34516: getting the next task for host managed-node2 46400 1727204586.34525: done getting next task for host managed-node2 46400 1727204586.34531: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204586.34537: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204586.34566: getting variables 46400 1727204586.34569: in VariableManager get_vars() 46400 1727204586.34612: Calling all_inventory to load vars for managed-node2 46400 1727204586.34615: Calling groups_inventory to load vars for managed-node2 46400 1727204586.34617: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.34629: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.34632: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.34635: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.38035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.39709: done with get_vars() 46400 1727204586.39742: done getting variables 46400 1727204586.39811: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.072) 0:01:16.683 ***** 46400 1727204586.39853: entering _queue_task() for managed-node2/service 46400 1727204586.40220: worker is 1 (out of 1 available) 46400 1727204586.40233: exiting _queue_task() for managed-node2/service 46400 1727204586.40247: done queuing things up, now waiting for results queue to drain 46400 1727204586.40249: waiting for pending results... 
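
The "Restart NetworkManager due to wireless or team interfaces" task (main.yml:109) loads the service action plugin and, as the evaluation below shows, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection profile. A hedged sketch under those assumptions; the service name and the restarted state are inferred from the task name, not quoted from the role:

    # Sketch only: the service module and the when condition come from the log;
    # the service name and state are assumptions based on the task name.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager   # assumption
        state: restarted       # assumption
      when: __network_wireless_connections_defined or __network_team_connections_defined
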
46400 1727204586.40584: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204586.41272: in run() - task 0affcd87-79f5-1303-fda8-000000001845 46400 1727204586.41297: variable 'ansible_search_path' from source: unknown 46400 1727204586.41305: variable 'ansible_search_path' from source: unknown 46400 1727204586.41346: calling self._execute() 46400 1727204586.41465: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.41480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.41495: variable 'omit' from source: magic vars 46400 1727204586.41920: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.41938: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.42083: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204586.42287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204586.47418: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204586.48389: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204586.48492: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204586.48592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204586.48625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204586.48743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.48924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.48957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.49193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.49275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.49445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.49659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.49694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204586.49736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.49787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.49865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.50114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.50142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.50189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.50323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.50834: variable 'network_connections' from source: include params 46400 1727204586.51038: variable 'interface' from source: play vars 46400 1727204586.51121: variable 'interface' from source: play vars 46400 1727204586.51256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204586.52765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204586.53004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204586.53106: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204586.53208: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204586.53413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204586.53443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204586.53577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.53611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204586.53801: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204586.54202: variable 'network_connections' from source: include params 46400 1727204586.54332: variable 'interface' 
from source: play vars 46400 1727204586.54407: variable 'interface' from source: play vars 46400 1727204586.54579: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204586.54586: when evaluation is False, skipping this task 46400 1727204586.54592: _execute() done 46400 1727204586.54597: dumping result to json 46400 1727204586.54603: done dumping result, returning 46400 1727204586.54612: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001845] 46400 1727204586.54621: sending task result for task 0affcd87-79f5-1303-fda8-000000001845 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204586.54809: no more pending results, returning what we have 46400 1727204586.54814: results queue empty 46400 1727204586.54815: checking for any_errors_fatal 46400 1727204586.54822: done checking for any_errors_fatal 46400 1727204586.54823: checking for max_fail_percentage 46400 1727204586.54825: done checking for max_fail_percentage 46400 1727204586.54826: checking to see if all hosts have failed and the running result is not ok 46400 1727204586.54827: done checking to see if all hosts have failed 46400 1727204586.54827: getting the remaining hosts for this loop 46400 1727204586.54829: done getting the remaining hosts for this loop 46400 1727204586.54833: getting the next task for host managed-node2 46400 1727204586.54843: done getting next task for host managed-node2 46400 1727204586.54848: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204586.54853: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204586.54879: getting variables 46400 1727204586.54881: in VariableManager get_vars() 46400 1727204586.54927: Calling all_inventory to load vars for managed-node2 46400 1727204586.54929: Calling groups_inventory to load vars for managed-node2 46400 1727204586.54932: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204586.54943: Calling all_plugins_play to load vars for managed-node2 46400 1727204586.54946: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204586.54949: Calling groups_plugins_play to load vars for managed-node2 46400 1727204586.56387: done sending task result for task 0affcd87-79f5-1303-fda8-000000001845 46400 1727204586.56392: WORKER PROCESS EXITING 46400 1727204586.58054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204586.60393: done with get_vars() 46400 1727204586.60425: done getting variables 46400 1727204586.60549: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:06 -0400 (0:00:00.207) 0:01:16.890 ***** 46400 1727204586.60593: entering _queue_task() for managed-node2/service 46400 1727204586.60953: worker is 1 (out of 1 available) 46400 1727204586.60969: exiting _queue_task() for managed-node2/service 46400 1727204586.60984: done queuing things up, now waiting for results queue to drain 46400 1727204586.60985: waiting for pending results... 
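
The "Enable and start NetworkManager" task (main.yml:122) is the first task in this block whose conditional (network_provider == "nm" or network_state != {}) evaluates to True, so it actually runs. Based on the variables resolved below (network_service_name) and the systemd invocation returned later in this log, a reconstruction of the task could look like the following; treat it as a sketch rather than the role's literal YAML:

    # Sketch reconstructed from the logged variables and the module arguments
    # returned by the systemd module further below.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
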
46400 1727204586.61478: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204586.61656: in run() - task 0affcd87-79f5-1303-fda8-000000001846 46400 1727204586.61683: variable 'ansible_search_path' from source: unknown 46400 1727204586.61691: variable 'ansible_search_path' from source: unknown 46400 1727204586.61729: calling self._execute() 46400 1727204586.61849: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.61872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.61888: variable 'omit' from source: magic vars 46400 1727204586.62356: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.62381: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204586.62830: variable 'network_provider' from source: set_fact 46400 1727204586.62843: variable 'network_state' from source: role '' defaults 46400 1727204586.62859: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204586.62880: variable 'omit' from source: magic vars 46400 1727204586.62951: variable 'omit' from source: magic vars 46400 1727204586.62990: variable 'network_service_name' from source: role '' defaults 46400 1727204586.63066: variable 'network_service_name' from source: role '' defaults 46400 1727204586.63184: variable '__network_provider_setup' from source: role '' defaults 46400 1727204586.63440: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204586.63506: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204586.63519: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204586.63585: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204586.63993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204586.67134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204586.67241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204586.67306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204586.67344: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204586.67440: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204586.67535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.67553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.67581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.67631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204586.67647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.67699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.67729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.67754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.67797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.67815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.68089: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204586.68221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.68243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.68321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.68325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.68338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.68447: variable 'ansible_python' from source: facts 46400 1727204586.68474: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204586.68565: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204586.68686: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204586.69248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.69277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.69303: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.69350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.69368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.69412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204586.69436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204586.69468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.69598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204586.69612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204586.69772: variable 'network_connections' from source: include params 46400 1727204586.69784: variable 'interface' from source: play vars 46400 1727204586.69872: variable 'interface' from source: play vars 46400 1727204586.70009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204586.70208: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204586.70295: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204586.70346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204586.70388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204586.70462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204586.70491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204586.70523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204586.70568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204586.70700: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204586.71371: variable 'network_connections' from source: include params 46400 1727204586.71375: variable 'interface' from source: play vars 46400 1727204586.71455: variable 'interface' from source: play vars 46400 1727204586.71532: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204586.71605: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204586.72060: variable 'network_connections' from source: include params 46400 1727204586.72065: variable 'interface' from source: play vars 46400 1727204586.72115: variable 'interface' from source: play vars 46400 1727204586.72145: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204586.72232: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204586.72591: variable 'network_connections' from source: include params 46400 1727204586.72594: variable 'interface' from source: play vars 46400 1727204586.72686: variable 'interface' from source: play vars 46400 1727204586.73520: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204586.73587: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204586.73594: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204586.73860: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204586.74249: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204586.75506: variable 'network_connections' from source: include params 46400 1727204586.75518: variable 'interface' from source: play vars 46400 1727204586.75600: variable 'interface' from source: play vars 46400 1727204586.75616: variable 'ansible_distribution' from source: facts 46400 1727204586.75625: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.75635: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.75673: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204586.75851: variable 'ansible_distribution' from source: facts 46400 1727204586.75860: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.75873: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.75888: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204586.76074: variable 'ansible_distribution' from source: facts 46400 1727204586.76082: variable '__network_rh_distros' from source: role '' defaults 46400 1727204586.76091: variable 'ansible_distribution_major_version' from source: facts 46400 1727204586.76130: variable 'network_provider' from source: set_fact 46400 1727204586.76161: variable 'omit' from source: magic vars 46400 1727204586.76199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204586.76235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204586.76260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204586.76286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204586.76301: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204586.76332: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204586.76342: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.76377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.76479: Set connection var ansible_shell_type to sh 46400 1727204586.76587: Set connection var ansible_shell_executable to /bin/sh 46400 1727204586.76598: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204586.76607: Set connection var ansible_connection to ssh 46400 1727204586.76616: Set connection var ansible_pipelining to False 46400 1727204586.76625: Set connection var ansible_timeout to 10 46400 1727204586.76659: variable 'ansible_shell_executable' from source: unknown 46400 1727204586.76698: variable 'ansible_connection' from source: unknown 46400 1727204586.76706: variable 'ansible_module_compression' from source: unknown 46400 1727204586.76712: variable 'ansible_shell_type' from source: unknown 46400 1727204586.76718: variable 'ansible_shell_executable' from source: unknown 46400 1727204586.76723: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204586.76730: variable 'ansible_pipelining' from source: unknown 46400 1727204586.76735: variable 'ansible_timeout' from source: unknown 46400 1727204586.76743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204586.76861: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204586.76889: variable 'omit' from source: magic vars 46400 1727204586.76902: starting attempt loop 46400 1727204586.76908: running the handler 46400 1727204586.76992: variable 'ansible_facts' from source: unknown 46400 1727204586.77822: _low_level_execute_command(): starting 46400 1727204586.77904: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204586.79219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.79228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.79331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.79335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204586.79348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.79354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204586.79372: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.79438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204586.79451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204586.79460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204586.79534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204586.81209: stdout chunk (state=3): >>>/root <<< 46400 1727204586.81372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204586.81378: stderr chunk (state=3): >>><<< 46400 1727204586.81382: stdout chunk (state=3): >>><<< 46400 1727204586.81406: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204586.81418: _low_level_execute_command(): starting 46400 1727204586.81425: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095 `" && echo ansible-tmp-1727204586.814059-51741-119593594203095="` echo /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095 `" ) && sleep 0' 46400 1727204586.82110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204586.82118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.82129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.82141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.82188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.82195: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204586.82204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.82215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204586.82222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204586.82228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204586.82235: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204586.82243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.82254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.82260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.82270: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204586.82291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.82361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204586.82379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204586.82389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204586.82479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204586.84320: stdout chunk (state=3): >>>ansible-tmp-1727204586.814059-51741-119593594203095=/root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095 <<< 46400 1727204586.84518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204586.84521: stdout chunk (state=3): >>><<< 46400 1727204586.84528: stderr chunk (state=3): >>><<< 46400 1727204586.84549: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204586.814059-51741-119593594203095=/root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204586.84588: variable 'ansible_module_compression' from source: unknown 46400 1727204586.84645: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204586.84711: variable 'ansible_facts' from source: unknown 46400 1727204586.85220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/AnsiballZ_systemd.py 46400 1727204586.85910: Sending initial data 46400 1727204586.85914: Sent initial data (155 bytes) 46400 1727204586.86905: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204586.86914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.86924: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.86938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.86992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.86996: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204586.87003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.87016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204586.87023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204586.87029: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204586.87037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.87046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.87058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.87077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.87083: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204586.87092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.87179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204586.87198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204586.87210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204586.87280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204586.88987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204586.89018: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204586.89059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpl775avqn /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/AnsiballZ_systemd.py <<< 46400 1727204586.89099: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204586.91546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204586.91640: stderr chunk (state=3): >>><<< 46400 1727204586.91643: stdout chunk (state=3): >>><<< 46400 1727204586.91670: done transferring module to remote 46400 1727204586.91681: _low_level_execute_command(): starting 46400 1727204586.91686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/ /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/AnsiballZ_systemd.py && sleep 0' 46400 1727204586.93349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.93354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.93393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.93408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204586.93413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.93425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204586.93430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.93509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204586.93515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204586.93530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204586.93591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204586.95386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204586.95389: stdout chunk (state=3): >>><<< 46400 1727204586.95391: stderr chunk (state=3): >>><<< 46400 1727204586.95493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204586.95496: _low_level_execute_command(): starting 46400 1727204586.95499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/AnsiballZ_systemd.py && sleep 0' 
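
At this point the AnsiballZ payload for the systemd module has been copied to the remote temp directory, made executable, and is executed with the remote interpreter (/usr/bin/python3.9). The "invocation" block in the JSON result that follows shows the arguments the role passed through; expressed as module options they amount to the following (values copied from the result below; the fully-qualified module name is an assumption):

    # Module arguments as reported in the result's "invocation" block.
    ansible.builtin.systemd:
      name: NetworkManager
      state: started
      enabled: true
      daemon_reload: false
      daemon_reexec: false
      scope: system
      no_block: false
      force: null
      masked: null
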
46400 1727204586.96084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204586.96093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.96104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.96118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.96156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.96166: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204586.96178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.96192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204586.96200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204586.96207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204586.96214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204586.96227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204586.96235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204586.96243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204586.96249: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204586.96259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204586.96343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204586.96350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204586.96357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204586.96433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.21532: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204587.21568: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6922240", "MemoryAvailable": "infinity", "CPUUsageNSec": "2172118000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204587.21585: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", 
"AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204587.23141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204587.23145: stdout chunk (state=3): >>><<< 46400 1727204587.23147: stderr chunk (state=3): >>><<< 46400 1727204587.23276: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6922240", "MemoryAvailable": "infinity", "CPUUsageNSec": "2172118000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": 
"dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204587.23386: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204587.23411: _low_level_execute_command(): starting 46400 1727204587.23421: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204586.814059-51741-119593594203095/ > /dev/null 2>&1 && sleep 0' 46400 1727204587.24105: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204587.24120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.24135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.24157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.24204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.24216: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204587.24230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.24251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204587.24265: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204587.24277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204587.24289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.24303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.24318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.24330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.24341: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204587.24356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.24436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204587.24460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.24481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.24552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.26384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
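Every remote command in this stretch of the run reuses an already-established SSH ControlMaster session instead of opening a fresh connection: the stderr chunks show "auto-mux: Trying existing master" followed by "mux_client_request_session: master session id: 2", and the per-host connection options are supplied through ansible_ssh_extra_args from host vars. The actual option values never appear in the log, so the inventory entry below is only a hypothetical sketch of settings that would produce this multiplexed behavior; only the host name and address are taken from the log, and the ControlMaster/ControlPersist/ControlPath values are assumptions.

    # Hypothetical inventory sketch; only managed-node2 and ansible_host match the log.
    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.13.78
          # assumed multiplexing options; the real ansible_ssh_extra_args are not shown
          ansible_ssh_extra_args: >-
            -o ControlMaster=auto
            -o ControlPersist=60s
            -o ControlPath=~/.ansible/cp/%h-%p-%r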
46400 1727204587.26455: stderr chunk (state=3): >>><<< 46400 1727204587.26458: stdout chunk (state=3): >>><<< 46400 1727204587.26771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204587.26775: handler run complete 46400 1727204587.26778: attempt loop complete, returning result 46400 1727204587.26780: _execute() done 46400 1727204587.26782: dumping result to json 46400 1727204587.26784: done dumping result, returning 46400 1727204587.26786: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000001846] 46400 1727204587.26788: sending task result for task 0affcd87-79f5-1303-fda8-000000001846 46400 1727204587.26951: done sending task result for task 0affcd87-79f5-1303-fda8-000000001846 46400 1727204587.26954: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204587.27016: no more pending results, returning what we have 46400 1727204587.27021: results queue empty 46400 1727204587.27022: checking for any_errors_fatal 46400 1727204587.27028: done checking for any_errors_fatal 46400 1727204587.27029: checking for max_fail_percentage 46400 1727204587.27031: done checking for max_fail_percentage 46400 1727204587.27032: checking to see if all hosts have failed and the running result is not ok 46400 1727204587.27033: done checking to see if all hosts have failed 46400 1727204587.27034: getting the remaining hosts for this loop 46400 1727204587.27036: done getting the remaining hosts for this loop 46400 1727204587.27040: getting the next task for host managed-node2 46400 1727204587.27049: done getting next task for host managed-node2 46400 1727204587.27054: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204587.27059: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204587.27076: getting variables 46400 1727204587.27078: in VariableManager get_vars() 46400 1727204587.27114: Calling all_inventory to load vars for managed-node2 46400 1727204587.27116: Calling groups_inventory to load vars for managed-node2 46400 1727204587.27118: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204587.27127: Calling all_plugins_play to load vars for managed-node2 46400 1727204587.27130: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204587.27132: Calling groups_plugins_play to load vars for managed-node2 46400 1727204587.28936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204587.30741: done with get_vars() 46400 1727204587.30769: done getting variables 46400 1727204587.30836: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.702) 0:01:17.593 ***** 46400 1727204587.30879: entering _queue_task() for managed-node2/service 46400 1727204587.31233: worker is 1 (out of 1 available) 46400 1727204587.31250: exiting _queue_task() for managed-node2/service 46400 1727204587.31267: done queuing things up, now waiting for results queue to drain 46400 1727204587.31269: waiting for pending results... 
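The result printed just above shows the "Enable and start NetworkManager" task finishing ok with changed=false: NetworkManager.service is already enabled and running, so ansible.legacy.systemd makes no changes, and the full unit-status payload is censored in the play output because the task runs with no_log. Reconstructed from the module_args recorded in the log (a sketch, not the role's actual task file), the equivalent standalone task would look roughly like this:

    # Sketch assembled from the logged module_args; not the role's source.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      no_log: true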
46400 1727204587.31586: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204587.31751: in run() - task 0affcd87-79f5-1303-fda8-000000001847 46400 1727204587.31774: variable 'ansible_search_path' from source: unknown 46400 1727204587.31783: variable 'ansible_search_path' from source: unknown 46400 1727204587.31829: calling self._execute() 46400 1727204587.31942: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.31954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.31971: variable 'omit' from source: magic vars 46400 1727204587.32365: variable 'ansible_distribution_major_version' from source: facts 46400 1727204587.32385: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204587.32509: variable 'network_provider' from source: set_fact 46400 1727204587.32518: Evaluated conditional (network_provider == "nm"): True 46400 1727204587.32603: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204587.32692: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204587.32869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204587.35179: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204587.35260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204587.35310: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204587.35350: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204587.35385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204587.35488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204587.35528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204587.35561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204587.35610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204587.35636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204587.35690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204587.35718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204587.35752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204587.35799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204587.35819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204587.35869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204587.35898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204587.35925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204587.35976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204587.35996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204587.36166: variable 'network_connections' from source: include params 46400 1727204587.36186: variable 'interface' from source: play vars 46400 1727204587.36254: variable 'interface' from source: play vars 46400 1727204587.36327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204587.36507: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204587.36547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204587.36588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204587.36624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204587.36672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204587.36702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204587.36734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204587.36766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204587.36824: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204587.37096: variable 'network_connections' from source: include params 46400 1727204587.37107: variable 'interface' from source: play vars 46400 1727204587.37180: variable 'interface' from source: play vars 46400 1727204587.37225: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204587.37232: when evaluation is False, skipping this task 46400 1727204587.37239: _execute() done 46400 1727204587.37248: dumping result to json 46400 1727204587.37257: done dumping result, returning 46400 1727204587.37269: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000001847] 46400 1727204587.37291: sending task result for task 0affcd87-79f5-1303-fda8-000000001847 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204587.37448: no more pending results, returning what we have 46400 1727204587.37452: results queue empty 46400 1727204587.37453: checking for any_errors_fatal 46400 1727204587.37484: done checking for any_errors_fatal 46400 1727204587.37486: checking for max_fail_percentage 46400 1727204587.37488: done checking for max_fail_percentage 46400 1727204587.37489: checking to see if all hosts have failed and the running result is not ok 46400 1727204587.37490: done checking to see if all hosts have failed 46400 1727204587.37491: getting the remaining hosts for this loop 46400 1727204587.37493: done getting the remaining hosts for this loop 46400 1727204587.37498: getting the next task for host managed-node2 46400 1727204587.37509: done getting next task for host managed-node2 46400 1727204587.37514: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204587.37519: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204587.37541: getting variables 46400 1727204587.37543: in VariableManager get_vars() 46400 1727204587.37588: Calling all_inventory to load vars for managed-node2 46400 1727204587.37591: Calling groups_inventory to load vars for managed-node2 46400 1727204587.37593: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204587.37603: Calling all_plugins_play to load vars for managed-node2 46400 1727204587.37606: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204587.37608: Calling groups_plugins_play to load vars for managed-node2 46400 1727204587.38585: done sending task result for task 0affcd87-79f5-1303-fda8-000000001847 46400 1727204587.38590: WORKER PROCESS EXITING 46400 1727204587.39437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204587.41003: done with get_vars() 46400 1727204587.41028: done getting variables 46400 1727204587.41078: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.102) 0:01:17.695 ***** 46400 1727204587.41105: entering _queue_task() for managed-node2/service 46400 1727204587.41355: worker is 1 (out of 1 available) 46400 1727204587.41370: exiting _queue_task() for managed-node2/service 46400 1727204587.41384: done queuing things up, now waiting for results queue to drain 46400 1727204587.41386: waiting for pending results... 
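The wpa_supplicant task is skipped because __network_wpa_supplicant_required evaluates to false even though the provider check (network_provider == "nm") passes; judging from the variables consulted just before the check (__network_ieee802_1x_connections_defined and __network_wireless_connections_defined), the role only needs wpa_supplicant when the requested profiles include 802.1X or wireless connections, which this run does not. A minimal sketch of that kind of guarded service task, illustrative only and not the role's actual source:

    # Illustrative sketch of a provider- and requirement-guarded service task.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool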
46400 1727204587.41584: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204587.41688: in run() - task 0affcd87-79f5-1303-fda8-000000001848 46400 1727204587.41698: variable 'ansible_search_path' from source: unknown 46400 1727204587.41703: variable 'ansible_search_path' from source: unknown 46400 1727204587.41733: calling self._execute() 46400 1727204587.41814: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.41818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.41828: variable 'omit' from source: magic vars 46400 1727204587.42119: variable 'ansible_distribution_major_version' from source: facts 46400 1727204587.42129: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204587.42214: variable 'network_provider' from source: set_fact 46400 1727204587.42220: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204587.42222: when evaluation is False, skipping this task 46400 1727204587.42225: _execute() done 46400 1727204587.42228: dumping result to json 46400 1727204587.42230: done dumping result, returning 46400 1727204587.42237: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000001848] 46400 1727204587.42243: sending task result for task 0affcd87-79f5-1303-fda8-000000001848 46400 1727204587.42336: done sending task result for task 0affcd87-79f5-1303-fda8-000000001848 46400 1727204587.42339: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204587.42451: no more pending results, returning what we have 46400 1727204587.42455: results queue empty 46400 1727204587.42456: checking for any_errors_fatal 46400 1727204587.42466: done checking for any_errors_fatal 46400 1727204587.42467: checking for max_fail_percentage 46400 1727204587.42469: done checking for max_fail_percentage 46400 1727204587.42470: checking to see if all hosts have failed and the running result is not ok 46400 1727204587.42470: done checking to see if all hosts have failed 46400 1727204587.42471: getting the remaining hosts for this loop 46400 1727204587.42473: done getting the remaining hosts for this loop 46400 1727204587.42574: getting the next task for host managed-node2 46400 1727204587.42583: done getting next task for host managed-node2 46400 1727204587.42587: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204587.42592: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204587.42612: getting variables 46400 1727204587.42614: in VariableManager get_vars() 46400 1727204587.42652: Calling all_inventory to load vars for managed-node2 46400 1727204587.42655: Calling groups_inventory to load vars for managed-node2 46400 1727204587.42658: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204587.42670: Calling all_plugins_play to load vars for managed-node2 46400 1727204587.42673: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204587.42676: Calling groups_plugins_play to load vars for managed-node2 46400 1727204587.44136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204587.45037: done with get_vars() 46400 1727204587.45055: done getting variables 46400 1727204587.45102: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.040) 0:01:17.735 ***** 46400 1727204587.45130: entering _queue_task() for managed-node2/copy 46400 1727204587.45376: worker is 1 (out of 1 available) 46400 1727204587.45390: exiting _queue_task() for managed-node2/copy 46400 1727204587.45405: done queuing things up, now waiting for results queue to drain 46400 1727204587.45406: waiting for pending results... 
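The "Enable network service" task above, and the "Ensure initscripts network file dependency is present" copy task that has just been queued, are both guarded by network_provider == "initscripts"; with the provider resolved to nm on this host, each one is skipped locally without contacting the remote node. A hedged sketch of that pattern follows; the task names come from the log, while the service name, file path and content are hypothetical and not taken from the role:

    # Pattern sketch only; unit name, dest and content are invented for illustration.
    - name: Enable network service
      ansible.builtin.service:
        name: network                   # hypothetical unit name
        enabled: true
      when: network_provider == "initscripts"

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network    # hypothetical path
        content: "# Managed by the network role\n"
        mode: "0644"
      when: network_provider == "initscripts"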
46400 1727204587.45605: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204587.45725: in run() - task 0affcd87-79f5-1303-fda8-000000001849 46400 1727204587.45735: variable 'ansible_search_path' from source: unknown 46400 1727204587.45738: variable 'ansible_search_path' from source: unknown 46400 1727204587.45771: calling self._execute() 46400 1727204587.45890: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.45894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.45897: variable 'omit' from source: magic vars 46400 1727204587.46283: variable 'ansible_distribution_major_version' from source: facts 46400 1727204587.46295: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204587.46413: variable 'network_provider' from source: set_fact 46400 1727204587.46418: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204587.46422: when evaluation is False, skipping this task 46400 1727204587.46425: _execute() done 46400 1727204587.46428: dumping result to json 46400 1727204587.46430: done dumping result, returning 46400 1727204587.46437: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000001849] 46400 1727204587.46443: sending task result for task 0affcd87-79f5-1303-fda8-000000001849 46400 1727204587.46558: done sending task result for task 0affcd87-79f5-1303-fda8-000000001849 46400 1727204587.46561: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204587.46610: no more pending results, returning what we have 46400 1727204587.46614: results queue empty 46400 1727204587.46615: checking for any_errors_fatal 46400 1727204587.46623: done checking for any_errors_fatal 46400 1727204587.46624: checking for max_fail_percentage 46400 1727204587.46626: done checking for max_fail_percentage 46400 1727204587.46627: checking to see if all hosts have failed and the running result is not ok 46400 1727204587.46628: done checking to see if all hosts have failed 46400 1727204587.46628: getting the remaining hosts for this loop 46400 1727204587.46630: done getting the remaining hosts for this loop 46400 1727204587.46636: getting the next task for host managed-node2 46400 1727204587.46647: done getting next task for host managed-node2 46400 1727204587.46652: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204587.46658: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204587.46685: getting variables 46400 1727204587.46687: in VariableManager get_vars() 46400 1727204587.46725: Calling all_inventory to load vars for managed-node2 46400 1727204587.46728: Calling groups_inventory to load vars for managed-node2 46400 1727204587.46730: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204587.46741: Calling all_plugins_play to load vars for managed-node2 46400 1727204587.46744: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204587.46746: Calling groups_plugins_play to load vars for managed-node2 46400 1727204587.48158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204587.49099: done with get_vars() 46400 1727204587.49120: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.040) 0:01:17.776 ***** 46400 1727204587.49192: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204587.49441: worker is 1 (out of 1 available) 46400 1727204587.49455: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204587.49471: done queuing things up, now waiting for results queue to drain 46400 1727204587.49473: waiting for pending results... 
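The actual profile work is handed to the collection's network_connections module: the connection list comes from include params, interpolates the play-level interface variable, and picks up an ansible_managed header rendered from get_ansible_managed.j2 a little further down. The log never shows the profile contents themselves, so the variables below are only a hypothetical example of the kind of input this role consumes, with invented values:

    # Hypothetical input; the real network_connections/interface values are not in the log.
    interface: eth0
    network_connections:
      - name: "{{ interface }}"
        interface_name: "{{ interface }}"
        type: ethernet
        state: up
        ip:
          dhcp4: true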
46400 1727204587.49675: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204587.49847: in run() - task 0affcd87-79f5-1303-fda8-00000000184a 46400 1727204587.49872: variable 'ansible_search_path' from source: unknown 46400 1727204587.49876: variable 'ansible_search_path' from source: unknown 46400 1727204587.50191: calling self._execute() 46400 1727204587.50196: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.50199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.50202: variable 'omit' from source: magic vars 46400 1727204587.50481: variable 'ansible_distribution_major_version' from source: facts 46400 1727204587.50486: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204587.50489: variable 'omit' from source: magic vars 46400 1727204587.50492: variable 'omit' from source: magic vars 46400 1727204587.50776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204587.53107: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204587.53192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204587.53242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204587.53291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204587.53322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204587.53420: variable 'network_provider' from source: set_fact 46400 1727204587.53599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204587.53634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204587.53678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204587.53730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204587.53752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204587.53848: variable 'omit' from source: magic vars 46400 1727204587.53979: variable 'omit' from source: magic vars 46400 1727204587.54388: variable 'network_connections' from source: include params 46400 1727204587.54405: variable 'interface' from source: play vars 46400 1727204587.54483: variable 'interface' from source: play vars 46400 1727204587.54678: variable 'omit' from source: magic vars 46400 1727204587.54699: variable '__lsr_ansible_managed' from source: task vars 46400 1727204587.54771: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204587.54986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204587.55232: Loaded config def from plugin (lookup/template) 46400 1727204587.55248: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204587.55283: File lookup term: get_ansible_managed.j2 46400 1727204587.55290: variable 'ansible_search_path' from source: unknown 46400 1727204587.55300: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204587.55316: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204587.55337: variable 'ansible_search_path' from source: unknown 46400 1727204587.62451: variable 'ansible_managed' from source: unknown 46400 1727204587.62628: variable 'omit' from source: magic vars 46400 1727204587.62670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204587.62710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204587.62734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204587.62757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204587.62778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204587.62819: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204587.62828: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.62837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.62946: Set connection var ansible_shell_type to sh 46400 1727204587.62965: Set connection var ansible_shell_executable to /bin/sh 46400 1727204587.62977: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204587.62987: Set connection var ansible_connection to ssh 46400 1727204587.62997: Set connection var ansible_pipelining to False 46400 1727204587.63014: Set connection var ansible_timeout to 10 46400 1727204587.63045: variable 'ansible_shell_executable' from source: unknown 46400 1727204587.63053: variable 'ansible_connection' from source: unknown 46400 1727204587.63063: variable 'ansible_module_compression' 
from source: unknown 46400 1727204587.63074: variable 'ansible_shell_type' from source: unknown 46400 1727204587.63081: variable 'ansible_shell_executable' from source: unknown 46400 1727204587.63089: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204587.63097: variable 'ansible_pipelining' from source: unknown 46400 1727204587.63104: variable 'ansible_timeout' from source: unknown 46400 1727204587.63119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204587.63271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204587.63297: variable 'omit' from source: magic vars 46400 1727204587.63308: starting attempt loop 46400 1727204587.63315: running the handler 46400 1727204587.63340: _low_level_execute_command(): starting 46400 1727204587.63352: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204587.64144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204587.64166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.64183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.64203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.64254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.64273: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204587.64289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.64307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204587.64321: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204587.64337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204587.64351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.64371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.64389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.64403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.64416: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204587.64431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.64517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204587.64535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.64555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.64637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.66298: stdout chunk (state=3): >>>/root <<< 46400 1727204587.66398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204587.66476: stderr chunk (state=3): >>><<< 46400 1727204587.66480: stdout 
chunk (state=3): >>><<< 46400 1727204587.66506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204587.66520: _low_level_execute_command(): starting 46400 1727204587.66528: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082 `" && echo ansible-tmp-1727204587.6650648-51857-126216070930082="` echo /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082 `" ) && sleep 0' 46400 1727204587.67180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204587.67188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.67200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.67215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.67257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.67267: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204587.67278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.67293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204587.67301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204587.67306: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204587.67314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.67323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.67335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.67342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.67348: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204587.67358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.67430: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 46400 1727204587.67445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.67450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.67537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.69444: stdout chunk (state=3): >>>ansible-tmp-1727204587.6650648-51857-126216070930082=/root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082 <<< 46400 1727204587.69513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204587.69617: stderr chunk (state=3): >>><<< 46400 1727204587.69629: stdout chunk (state=3): >>><<< 46400 1727204587.69774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204587.6650648-51857-126216070930082=/root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204587.69784: variable 'ansible_module_compression' from source: unknown 46400 1727204587.69787: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204587.69883: variable 'ansible_facts' from source: unknown 46400 1727204587.69924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/AnsiballZ_network_connections.py 46400 1727204587.70100: Sending initial data 46400 1727204587.70103: Sent initial data (168 bytes) 46400 1727204587.71220: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204587.71233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.71247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.71271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.71319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.71333: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204587.71349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.71372: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 46400 1727204587.71383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204587.71392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204587.71402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.71413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.71431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.71443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204587.71453: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204587.71470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.71550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204587.71577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.71592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.71663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.73365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204587.73430: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204587.73448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp2vw3ut7x /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/AnsiballZ_network_connections.py <<< 46400 1727204587.73487: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204587.76086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204587.76222: stderr chunk (state=3): >>><<< 46400 1727204587.76225: stdout chunk (state=3): >>><<< 46400 1727204587.76228: done transferring module to remote 46400 1727204587.76230: _low_level_execute_command(): starting 46400 1727204587.76232: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/ /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/AnsiballZ_network_connections.py && sleep 0' 46400 1727204587.77825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.77829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.77924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.77928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.77930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.78194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204587.78198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.78228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.78285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204587.80028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204587.80117: stderr chunk (state=3): >>><<< 46400 1727204587.80121: stdout chunk (state=3): >>><<< 46400 1727204587.80223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204587.80227: _low_level_execute_command(): starting 46400 1727204587.80230: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/AnsiballZ_network_connections.py && sleep 0' 46400 1727204587.81830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204587.81859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204587.81868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204587.81955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204587.81960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204587.81967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204587.82031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204587.82170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204587.82226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.08392: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204588.09894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204588.09971: stderr chunk (state=3): >>><<< 46400 1727204588.09989: stdout chunk (state=3): >>><<< 46400 1727204588.10134: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204588.10138: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204588.10140: _low_level_execute_command(): starting 46400 1727204588.10143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204587.6650648-51857-126216070930082/ > /dev/null 2>&1 && sleep 0' 46400 1727204588.11678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.11683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.11701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204588.11704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.11831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.11834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.11836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.11909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.11913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.12049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.12207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.14000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.14083: stderr chunk (state=3): >>><<< 46400 1727204588.14087: stdout chunk (state=3): >>><<< 46400 1727204588.14270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204588.14278: handler run complete 46400 1727204588.14281: attempt loop complete, returning result 46400 1727204588.14283: _execute() done 46400 1727204588.14285: dumping result to json 46400 1727204588.14287: done dumping result, returning 46400 1727204588.14289: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-00000000184a] 46400 1727204588.14291: sending task result for task 0affcd87-79f5-1303-fda8-00000000184a changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b 46400 1727204588.14475: no more pending results, returning what we have 46400 1727204588.14479: results queue empty 46400 1727204588.14480: checking for any_errors_fatal 46400 1727204588.14486: done checking for any_errors_fatal 46400 1727204588.14487: checking for max_fail_percentage 46400 1727204588.14489: done checking for max_fail_percentage 46400 1727204588.14489: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.14490: done checking to see if all hosts have failed 46400 1727204588.14491: getting the remaining hosts for this loop 46400 1727204588.14493: done getting the remaining hosts for this loop 46400 1727204588.14497: getting the next task for host managed-node2 46400 1727204588.14505: done getting next task for host managed-node2 46400 1727204588.14508: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204588.14514: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204588.14526: getting variables 46400 1727204588.14528: in VariableManager get_vars() 46400 1727204588.14574: Calling all_inventory to load vars for managed-node2 46400 1727204588.14578: Calling groups_inventory to load vars for managed-node2 46400 1727204588.14580: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.14590: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.14592: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.14595: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.15189: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184a 46400 1727204588.15193: WORKER PROCESS EXITING 46400 1727204588.17024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.19835: done with get_vars() 46400 1727204588.20090: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.710) 0:01:18.486 ***** 46400 1727204588.20227: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204588.20930: worker is 1 (out of 1 available) 46400 1727204588.20944: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204588.20957: done queuing things up, now waiting for results queue to drain 46400 1727204588.20958: waiting for pending results... 
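The module_args echoed in the "Configure networking connection profiles" result above mirror the role's network_connections input one-to-one. Below is a minimal sketch of a play that would produce that invocation; the play name, hosts pattern, and overall layout are assumptions for illustration, while the profile values (statebr, bridge, persistent_state present, dhcp4/auto6 disabled) are copied from the recorded invocation.

    # Hypothetical play reproducing the invocation logged above (sketch only,
    # not the actual test playbook under /tmp/collections-G1p).
    - name: Create the statebr bridge profile with IP configuration disabled
      hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: statebr             # 'name': 'statebr' in module_args
                type: bridge              # 'type': 'bridge'
                persistent_state: present # 'persistent_state': 'present'
                ip:
                  dhcp4: false            # 'dhcp4': False
                  auto6: false            # 'auto6': False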
46400 1727204588.21270: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204588.21441: in run() - task 0affcd87-79f5-1303-fda8-00000000184b 46400 1727204588.21467: variable 'ansible_search_path' from source: unknown 46400 1727204588.21476: variable 'ansible_search_path' from source: unknown 46400 1727204588.21520: calling self._execute() 46400 1727204588.21624: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.21637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.21654: variable 'omit' from source: magic vars 46400 1727204588.22047: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.22072: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.22203: variable 'network_state' from source: role '' defaults 46400 1727204588.22220: Evaluated conditional (network_state != {}): False 46400 1727204588.22228: when evaluation is False, skipping this task 46400 1727204588.22235: _execute() done 46400 1727204588.22243: dumping result to json 46400 1727204588.22250: done dumping result, returning 46400 1727204588.22262: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-00000000184b] 46400 1727204588.22293: sending task result for task 0affcd87-79f5-1303-fda8-00000000184b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204588.22457: no more pending results, returning what we have 46400 1727204588.22468: results queue empty 46400 1727204588.22470: checking for any_errors_fatal 46400 1727204588.22484: done checking for any_errors_fatal 46400 1727204588.22485: checking for max_fail_percentage 46400 1727204588.22487: done checking for max_fail_percentage 46400 1727204588.22488: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.22489: done checking to see if all hosts have failed 46400 1727204588.22489: getting the remaining hosts for this loop 46400 1727204588.22492: done getting the remaining hosts for this loop 46400 1727204588.22496: getting the next task for host managed-node2 46400 1727204588.22506: done getting next task for host managed-node2 46400 1727204588.22512: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204588.22519: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204588.22544: getting variables 46400 1727204588.22546: in VariableManager get_vars() 46400 1727204588.22595: Calling all_inventory to load vars for managed-node2 46400 1727204588.22599: Calling groups_inventory to load vars for managed-node2 46400 1727204588.22601: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.22614: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.22617: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.22620: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.23697: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184b 46400 1727204588.23701: WORKER PROCESS EXITING 46400 1727204588.24493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.27005: done with get_vars() 46400 1727204588.27039: done getting variables 46400 1727204588.27109: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.069) 0:01:18.556 ***** 46400 1727204588.27147: entering _queue_task() for managed-node2/debug 46400 1727204588.27508: worker is 1 (out of 1 available) 46400 1727204588.27522: exiting _queue_task() for managed-node2/debug 46400 1727204588.27538: done queuing things up, now waiting for results queue to drain 46400 1727204588.27539: waiting for pending results... 
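By contrast, the "Configure networking state" task above never reaches the provider: network_state is still at the role default of an empty dict, so the conditional network_state != {} evaluates to False and the executor reports "skipping". A small sketch of that gate follows; the task body is a stand-in debug action, since the argument schema of the real fedora.linux_system_roles.network_state module is not visible in this log, and only the when expression is taken from the recorded false_condition.

    # Illustration of the skip recorded as "false_condition": "network_state != {}"
    - name: Configure networking state (gate illustration only)
      ansible.builtin.debug:
        msg: "a non-empty network_state would be handed to the state provider here"
      when: network_state != {}   # with the default network_state: {} this task is skipped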
46400 1727204588.27859: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204588.28020: in run() - task 0affcd87-79f5-1303-fda8-00000000184c 46400 1727204588.28043: variable 'ansible_search_path' from source: unknown 46400 1727204588.28052: variable 'ansible_search_path' from source: unknown 46400 1727204588.28104: calling self._execute() 46400 1727204588.28209: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.28221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.28239: variable 'omit' from source: magic vars 46400 1727204588.28656: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.28683: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.28694: variable 'omit' from source: magic vars 46400 1727204588.28770: variable 'omit' from source: magic vars 46400 1727204588.28809: variable 'omit' from source: magic vars 46400 1727204588.28866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204588.28907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204588.28932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204588.28955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.28977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.29012: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204588.29021: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.29028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.29133: Set connection var ansible_shell_type to sh 46400 1727204588.29149: Set connection var ansible_shell_executable to /bin/sh 46400 1727204588.29163: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204588.29178: Set connection var ansible_connection to ssh 46400 1727204588.29192: Set connection var ansible_pipelining to False 46400 1727204588.29203: Set connection var ansible_timeout to 10 46400 1727204588.29232: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.29241: variable 'ansible_connection' from source: unknown 46400 1727204588.29250: variable 'ansible_module_compression' from source: unknown 46400 1727204588.29257: variable 'ansible_shell_type' from source: unknown 46400 1727204588.29276: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.29285: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.29298: variable 'ansible_pipelining' from source: unknown 46400 1727204588.29305: variable 'ansible_timeout' from source: unknown 46400 1727204588.29313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.29467: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204588.29486: variable 'omit' from source: magic vars 46400 1727204588.29497: starting attempt loop 46400 1727204588.29506: running the handler 46400 1727204588.29647: variable '__network_connections_result' from source: set_fact 46400 1727204588.29695: handler run complete 46400 1727204588.29709: attempt loop complete, returning result 46400 1727204588.29712: _execute() done 46400 1727204588.29714: dumping result to json 46400 1727204588.29717: done dumping result, returning 46400 1727204588.29723: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-00000000184c] 46400 1727204588.29733: sending task result for task 0affcd87-79f5-1303-fda8-00000000184c 46400 1727204588.29825: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184c 46400 1727204588.29828: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b" ] } 46400 1727204588.29900: no more pending results, returning what we have 46400 1727204588.29904: results queue empty 46400 1727204588.29906: checking for any_errors_fatal 46400 1727204588.29912: done checking for any_errors_fatal 46400 1727204588.29913: checking for max_fail_percentage 46400 1727204588.29915: done checking for max_fail_percentage 46400 1727204588.29916: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.29917: done checking to see if all hosts have failed 46400 1727204588.29917: getting the remaining hosts for this loop 46400 1727204588.29919: done getting the remaining hosts for this loop 46400 1727204588.29923: getting the next task for host managed-node2 46400 1727204588.29930: done getting next task for host managed-node2 46400 1727204588.29934: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204588.29942: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204588.29955: getting variables 46400 1727204588.29956: in VariableManager get_vars() 46400 1727204588.29996: Calling all_inventory to load vars for managed-node2 46400 1727204588.29999: Calling groups_inventory to load vars for managed-node2 46400 1727204588.30001: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.30011: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.30013: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.30015: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.30962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.32452: done with get_vars() 46400 1727204588.32486: done getting variables 46400 1727204588.32581: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.054) 0:01:18.610 ***** 46400 1727204588.32623: entering _queue_task() for managed-node2/debug 46400 1727204588.32879: worker is 1 (out of 1 available) 46400 1727204588.32893: exiting _queue_task() for managed-node2/debug 46400 1727204588.32906: done queuing things up, now waiting for results queue to drain 46400 1727204588.32908: waiting for pending results... 46400 1727204588.33111: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204588.33207: in run() - task 0affcd87-79f5-1303-fda8-00000000184d 46400 1727204588.33218: variable 'ansible_search_path' from source: unknown 46400 1727204588.33223: variable 'ansible_search_path' from source: unknown 46400 1727204588.33252: calling self._execute() 46400 1727204588.33328: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.33333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.33345: variable 'omit' from source: magic vars 46400 1727204588.33633: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.33644: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.33649: variable 'omit' from source: magic vars 46400 1727204588.33699: variable 'omit' from source: magic vars 46400 1727204588.33722: variable 'omit' from source: magic vars 46400 1727204588.33759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204588.33790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204588.33808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204588.33821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.33829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.33854: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204588.33858: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.33863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.33930: Set connection var ansible_shell_type to sh 46400 1727204588.33938: Set connection var ansible_shell_executable to /bin/sh 46400 1727204588.33945: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204588.33948: Set connection var ansible_connection to ssh 46400 1727204588.33953: Set connection var ansible_pipelining to False 46400 1727204588.33959: Set connection var ansible_timeout to 10 46400 1727204588.33979: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.33983: variable 'ansible_connection' from source: unknown 46400 1727204588.33986: variable 'ansible_module_compression' from source: unknown 46400 1727204588.33988: variable 'ansible_shell_type' from source: unknown 46400 1727204588.33990: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.33994: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.33996: variable 'ansible_pipelining' from source: unknown 46400 1727204588.33998: variable 'ansible_timeout' from source: unknown 46400 1727204588.34000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.34105: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204588.34114: variable 'omit' from source: magic vars 46400 1727204588.34121: starting attempt loop 46400 1727204588.34124: running the handler 46400 1727204588.34163: variable '__network_connections_result' from source: set_fact 46400 1727204588.34224: variable '__network_connections_result' from source: set_fact 46400 1727204588.34312: handler run complete 46400 1727204588.34332: attempt loop complete, returning result 46400 1727204588.34335: _execute() done 46400 1727204588.34338: dumping result to json 46400 1727204588.34343: done dumping result, returning 46400 1727204588.34348: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-00000000184d] 46400 1727204588.34355: sending task result for task 0affcd87-79f5-1303-fda8-00000000184d 46400 1727204588.34483: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184d 46400 1727204588.34488: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b" ] } } 46400 1727204588.34855: no more pending results, returning what we have 46400 
1727204588.34858: results queue empty 46400 1727204588.34859: checking for any_errors_fatal 46400 1727204588.34869: done checking for any_errors_fatal 46400 1727204588.34870: checking for max_fail_percentage 46400 1727204588.34872: done checking for max_fail_percentage 46400 1727204588.34873: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.34874: done checking to see if all hosts have failed 46400 1727204588.34874: getting the remaining hosts for this loop 46400 1727204588.34876: done getting the remaining hosts for this loop 46400 1727204588.34880: getting the next task for host managed-node2 46400 1727204588.34887: done getting next task for host managed-node2 46400 1727204588.34892: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204588.34897: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204588.34909: getting variables 46400 1727204588.34911: in VariableManager get_vars() 46400 1727204588.34953: Calling all_inventory to load vars for managed-node2 46400 1727204588.34956: Calling groups_inventory to load vars for managed-node2 46400 1727204588.34958: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.34973: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.34976: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.34979: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.36409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.38282: done with get_vars() 46400 1727204588.38306: done getting variables 46400 1727204588.38374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.057) 0:01:18.668 ***** 46400 1727204588.38410: entering _queue_task() for managed-node2/debug 46400 1727204588.38777: worker is 1 (out of 1 available) 46400 1727204588.38791: exiting _queue_task() for managed-node2/debug 46400 1727204588.38804: done queuing things up, now waiting for results queue to drain 46400 1727204588.38806: waiting for pending results... 46400 1727204588.39120: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204588.39283: in run() - task 0affcd87-79f5-1303-fda8-00000000184e 46400 1727204588.39304: variable 'ansible_search_path' from source: unknown 46400 1727204588.39312: variable 'ansible_search_path' from source: unknown 46400 1727204588.39353: calling self._execute() 46400 1727204588.39456: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.39477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.39494: variable 'omit' from source: magic vars 46400 1727204588.39892: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.39914: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.40047: variable 'network_state' from source: role '' defaults 46400 1727204588.40069: Evaluated conditional (network_state != {}): False 46400 1727204588.40078: when evaluation is False, skipping this task 46400 1727204588.40086: _execute() done 46400 1727204588.40094: dumping result to json 46400 1727204588.40103: done dumping result, returning 46400 1727204588.40114: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-00000000184e] 46400 1727204588.40131: sending task result for task 0affcd87-79f5-1303-fda8-00000000184e skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204588.40295: no more pending results, returning what we have 46400 1727204588.40300: results queue empty 46400 1727204588.40301: checking for any_errors_fatal 46400 1727204588.40311: done checking 
for any_errors_fatal 46400 1727204588.40313: checking for max_fail_percentage 46400 1727204588.40314: done checking for max_fail_percentage 46400 1727204588.40315: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.40316: done checking to see if all hosts have failed 46400 1727204588.40317: getting the remaining hosts for this loop 46400 1727204588.40319: done getting the remaining hosts for this loop 46400 1727204588.40323: getting the next task for host managed-node2 46400 1727204588.40334: done getting next task for host managed-node2 46400 1727204588.40339: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204588.40347: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204588.40377: getting variables 46400 1727204588.40379: in VariableManager get_vars() 46400 1727204588.40424: Calling all_inventory to load vars for managed-node2 46400 1727204588.40427: Calling groups_inventory to load vars for managed-node2 46400 1727204588.40430: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.40446: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.40449: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.40452: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.41385: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184e 46400 1727204588.41389: WORKER PROCESS EXITING 46400 1727204588.42266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.43990: done with get_vars() 46400 1727204588.44028: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.057) 0:01:18.725 ***** 46400 1727204588.44135: entering _queue_task() for managed-node2/ping 46400 1727204588.44498: worker is 1 (out of 1 available) 46400 1727204588.44513: exiting _queue_task() for managed-node2/ping 46400 1727204588.44525: done queuing things up, now waiting for results queue to drain 46400 1727204588.44527: waiting for pending results... 
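The tail of the role visible here first reports on the registered __network_connections_result (its stderr_lines, then the full structure), skips the corresponding network_state report for the same reason as before, and finally re-tests reachability with the ping action queued next. A rough sketch of that sequence follows, with the task names and the variable name taken from the log; the register step on the earlier network_connections call is assumed, and the real role tasks at main.yml:177, 181, and 192 may carry extra conditionals not shown in this log.

    # Approximate shape of the reporting / re-test tasks observed in the log.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result

    - name: Re-test connectivity
      ansible.builtin.ping: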
46400 1727204588.44843: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204588.45012: in run() - task 0affcd87-79f5-1303-fda8-00000000184f 46400 1727204588.45035: variable 'ansible_search_path' from source: unknown 46400 1727204588.45043: variable 'ansible_search_path' from source: unknown 46400 1727204588.45094: calling self._execute() 46400 1727204588.45206: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.45219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.45235: variable 'omit' from source: magic vars 46400 1727204588.45643: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.45666: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.45678: variable 'omit' from source: magic vars 46400 1727204588.45756: variable 'omit' from source: magic vars 46400 1727204588.45799: variable 'omit' from source: magic vars 46400 1727204588.45848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204588.45893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204588.45919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204588.45941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.45965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.46000: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204588.46008: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.46015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.46125: Set connection var ansible_shell_type to sh 46400 1727204588.46143: Set connection var ansible_shell_executable to /bin/sh 46400 1727204588.46154: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204588.46172: Set connection var ansible_connection to ssh 46400 1727204588.46186: Set connection var ansible_pipelining to False 46400 1727204588.46198: Set connection var ansible_timeout to 10 46400 1727204588.46229: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.46239: variable 'ansible_connection' from source: unknown 46400 1727204588.46247: variable 'ansible_module_compression' from source: unknown 46400 1727204588.46254: variable 'ansible_shell_type' from source: unknown 46400 1727204588.46266: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.46277: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.46288: variable 'ansible_pipelining' from source: unknown 46400 1727204588.46296: variable 'ansible_timeout' from source: unknown 46400 1727204588.46305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.46534: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204588.46553: variable 'omit' from source: magic vars 46400 
1727204588.46569: starting attempt loop 46400 1727204588.46578: running the handler 46400 1727204588.46598: _low_level_execute_command(): starting 46400 1727204588.46616: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204588.47413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.47429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.47444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.47468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.47518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.47531: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204588.47544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.47567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204588.47580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204588.47591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204588.47609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.47625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.47644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.47658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.47677: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204588.47692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.47776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.47800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.47818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.47902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.49559: stdout chunk (state=3): >>>/root <<< 46400 1727204588.49677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.49778: stderr chunk (state=3): >>><<< 46400 1727204588.49793: stdout chunk (state=3): >>><<< 46400 1727204588.49871: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204588.49875: _low_level_execute_command(): starting 46400 1727204588.49878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842 `" && echo ansible-tmp-1727204588.4983425-51903-18559145172842="` echo /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842 `" ) && sleep 0' 46400 1727204588.50595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.50613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.50635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.50653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.50698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.50711: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204588.50729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.50754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204588.50771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204588.50781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204588.50792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.50804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.50819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.50830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.50846: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204588.50868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.50937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.50973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.50991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.51073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.52921: stdout chunk (state=3): >>>ansible-tmp-1727204588.4983425-51903-18559145172842=/root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842 <<< 46400 1727204588.53032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.53130: stderr chunk (state=3): >>><<< 46400 1727204588.53143: stdout chunk (state=3): >>><<< 46400 1727204588.53376: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204588.4983425-51903-18559145172842=/root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204588.53379: variable 'ansible_module_compression' from source: unknown 46400 1727204588.53382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204588.53384: variable 'ansible_facts' from source: unknown 46400 1727204588.53406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/AnsiballZ_ping.py 46400 1727204588.53575: Sending initial data 46400 1727204588.53578: Sent initial data (152 bytes) 46400 1727204588.54665: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.54692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.54709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.54728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.54776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.54791: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204588.54812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.54830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204588.54843: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204588.54854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204588.54870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.54883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.54898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.54918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.54928: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204588.54939: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.55013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.55042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.55058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.55137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.56923: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204588.56928: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204588.56988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmplh11rjne /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/AnsiballZ_ping.py <<< 46400 1727204588.57019: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204588.58066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.58378: stderr chunk (state=3): >>><<< 46400 1727204588.58382: stdout chunk (state=3): >>><<< 46400 1727204588.58384: done transferring module to remote 46400 1727204588.58387: _low_level_execute_command(): starting 46400 1727204588.58394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/ /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/AnsiballZ_ping.py && sleep 0' 46400 1727204588.59008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.59032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.59048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.59074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.59115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.59127: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204588.59146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.59169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204588.59182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204588.59193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204588.59204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.59217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.59232: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.59245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.59266: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204588.59282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.59358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.59391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.59408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.59485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.61212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.61325: stderr chunk (state=3): >>><<< 46400 1727204588.61348: stdout chunk (state=3): >>><<< 46400 1727204588.61476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204588.61480: _low_level_execute_command(): starting 46400 1727204588.61483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/AnsiballZ_ping.py && sleep 0' 46400 1727204588.62116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.62138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.62153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.62175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.62217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.62229: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204588.62248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.62271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204588.62284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204588.62295: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 46400 1727204588.62307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.62320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.62335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.62355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204588.62371: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204588.62385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.62473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.62496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.62512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.62595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.75626: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204588.76690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204588.76693: stdout chunk (state=3): >>><<< 46400 1727204588.76696: stderr chunk (state=3): >>><<< 46400 1727204588.76824: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
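Annotation: the chunked stderr/stdout above is one complete module round trip over the multiplexed SSH connection: create a remote temp directory, sftp-push AnsiballZ_ping.py, chmod it, execute it with /usr/bin/python3.9, read back {"ping": "pong"}, then remove the temp directory. As a standalone illustration (not part of this test run), the same check can be reproduced with a tiny playbook; the play target matches the inventory host in this run, while the register/assert steps are additions for demonstration only:

# Illustrative only: re-run the connectivity check by hand and verify the reply.
- name: Manually re-check connectivity
  hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Call ping (returns {"ping": "pong"} on success)
      ansible.builtin.ping:
      register: ping_result

    - name: Verify the expected reply came back
      ansible.builtin.assert:
        that:
          - ping_result.ping == "pong"
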
46400 1727204588.76829: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204588.76835: _low_level_execute_command(): starting 46400 1727204588.76837: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204588.4983425-51903-18559145172842/ > /dev/null 2>&1 && sleep 0' 46400 1727204588.77456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204588.77485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.77489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204588.77526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.77529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204588.77531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204588.77534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204588.77595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204588.77603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204588.77606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204588.77642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204588.79426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204588.79504: stderr chunk (state=3): >>><<< 46400 1727204588.79510: stdout chunk (state=3): >>><<< 46400 1727204588.79534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204588.79540: handler run complete 46400 1727204588.79565: attempt loop complete, returning result 46400 1727204588.79569: _execute() done 46400 1727204588.79571: dumping result to json 46400 1727204588.79578: done dumping result, returning 46400 1727204588.79587: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-00000000184f] 46400 1727204588.79592: sending task result for task 0affcd87-79f5-1303-fda8-00000000184f 46400 1727204588.79698: done sending task result for task 0affcd87-79f5-1303-fda8-00000000184f 46400 1727204588.79700: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204588.79839: no more pending results, returning what we have 46400 1727204588.79843: results queue empty 46400 1727204588.79845: checking for any_errors_fatal 46400 1727204588.79853: done checking for any_errors_fatal 46400 1727204588.79854: checking for max_fail_percentage 46400 1727204588.79856: done checking for max_fail_percentage 46400 1727204588.79857: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.79858: done checking to see if all hosts have failed 46400 1727204588.79859: getting the remaining hosts for this loop 46400 1727204588.79860: done getting the remaining hosts for this loop 46400 1727204588.79866: getting the next task for host managed-node2 46400 1727204588.79880: done getting next task for host managed-node2 46400 1727204588.79883: ^ task is: TASK: meta (role_complete) 46400 1727204588.79888: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204588.79905: getting variables 46400 1727204588.79907: in VariableManager get_vars() 46400 1727204588.79956: Calling all_inventory to load vars for managed-node2 46400 1727204588.79959: Calling groups_inventory to load vars for managed-node2 46400 1727204588.79962: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.79974: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.79977: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.79980: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.81148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.82152: done with get_vars() 46400 1727204588.82191: done getting variables 46400 1727204588.82311: done queuing things up, now waiting for results queue to drain 46400 1727204588.82312: results queue empty 46400 1727204588.82313: checking for any_errors_fatal 46400 1727204588.82315: done checking for any_errors_fatal 46400 1727204588.82315: checking for max_fail_percentage 46400 1727204588.82316: done checking for max_fail_percentage 46400 1727204588.82317: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.82317: done checking to see if all hosts have failed 46400 1727204588.82318: getting the remaining hosts for this loop 46400 1727204588.82318: done getting the remaining hosts for this loop 46400 1727204588.82320: getting the next task for host managed-node2 46400 1727204588.82327: done getting next task for host managed-node2 46400 1727204588.82329: ^ task is: TASK: Show result 46400 1727204588.82332: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204588.82359: getting variables 46400 1727204588.82362: in VariableManager get_vars() 46400 1727204588.82391: Calling all_inventory to load vars for managed-node2 46400 1727204588.82393: Calling groups_inventory to load vars for managed-node2 46400 1727204588.82402: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.82408: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.82411: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.82414: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.83469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.85103: done with get_vars() 46400 1727204588.85124: done getting variables 46400 1727204588.85192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.410) 0:01:19.136 ***** 46400 1727204588.85236: entering _queue_task() for managed-node2/debug 46400 1727204588.85658: worker is 1 (out of 1 available) 46400 1727204588.85682: exiting _queue_task() for managed-node2/debug 46400 1727204588.85696: done queuing things up, now waiting for results queue to drain 46400 1727204588.85698: waiting for pending results... 46400 1727204588.86004: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204588.86116: in run() - task 0affcd87-79f5-1303-fda8-0000000017d1 46400 1727204588.86125: variable 'ansible_search_path' from source: unknown 46400 1727204588.86231: variable 'ansible_search_path' from source: unknown 46400 1727204588.86236: calling self._execute() 46400 1727204588.86286: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.86290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.86303: variable 'omit' from source: magic vars 46400 1727204588.86772: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.86776: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.86795: variable 'omit' from source: magic vars 46400 1727204588.86798: variable 'omit' from source: magic vars 46400 1727204588.86852: variable 'omit' from source: magic vars 46400 1727204588.86883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204588.86969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204588.87014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204588.87018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.87021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204588.87049: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204588.87052: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.87056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.87156: Set connection var ansible_shell_type to sh 46400 1727204588.87171: Set connection var ansible_shell_executable to /bin/sh 46400 1727204588.87176: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204588.87182: Set connection var ansible_connection to ssh 46400 1727204588.87187: Set connection var ansible_pipelining to False 46400 1727204588.87193: Set connection var ansible_timeout to 10 46400 1727204588.87221: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.87224: variable 'ansible_connection' from source: unknown 46400 1727204588.87227: variable 'ansible_module_compression' from source: unknown 46400 1727204588.87229: variable 'ansible_shell_type' from source: unknown 46400 1727204588.87232: variable 'ansible_shell_executable' from source: unknown 46400 1727204588.87236: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.87238: variable 'ansible_pipelining' from source: unknown 46400 1727204588.87241: variable 'ansible_timeout' from source: unknown 46400 1727204588.87243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.87393: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204588.87404: variable 'omit' from source: magic vars 46400 1727204588.87409: starting attempt loop 46400 1727204588.87412: running the handler 46400 1727204588.87494: variable '__network_connections_result' from source: set_fact 46400 1727204588.87569: variable '__network_connections_result' from source: set_fact 46400 1727204588.87708: handler run complete 46400 1727204588.87724: attempt loop complete, returning result 46400 1727204588.87727: _execute() done 46400 1727204588.87730: dumping result to json 46400 1727204588.87732: done dumping result, returning 46400 1727204588.87742: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-0000000017d1] 46400 1727204588.87745: sending task result for task 0affcd87-79f5-1303-fda8-0000000017d1 46400 1727204588.87848: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017d1 46400 1727204588.87851: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b" ] } } 46400 1727204588.87931: no more pending results, returning what we have 46400 1727204588.87936: results queue empty 46400 1727204588.87937: checking for any_errors_fatal 46400 1727204588.87940: done checking for any_errors_fatal 46400 
1727204588.87940: checking for max_fail_percentage 46400 1727204588.87943: done checking for max_fail_percentage 46400 1727204588.87944: checking to see if all hosts have failed and the running result is not ok 46400 1727204588.87944: done checking to see if all hosts have failed 46400 1727204588.87945: getting the remaining hosts for this loop 46400 1727204588.87947: done getting the remaining hosts for this loop 46400 1727204588.87952: getting the next task for host managed-node2 46400 1727204588.87965: done getting next task for host managed-node2 46400 1727204588.87969: ^ task is: TASK: Include network role 46400 1727204588.87976: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204588.87982: getting variables 46400 1727204588.87984: in VariableManager get_vars() 46400 1727204588.88026: Calling all_inventory to load vars for managed-node2 46400 1727204588.88058: Calling groups_inventory to load vars for managed-node2 46400 1727204588.88063: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.88083: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.88087: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.88090: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.90029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204588.92568: done with get_vars() 46400 1727204588.92610: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:03:08 -0400 (0:00:00.075) 0:01:19.212 ***** 46400 1727204588.92744: entering _queue_task() for managed-node2/include_role 46400 1727204588.93300: worker is 1 (out of 1 available) 46400 1727204588.93312: exiting _queue_task() for managed-node2/include_role 46400 1727204588.93327: done queuing things up, now waiting for results queue to drain 46400 1727204588.93328: waiting for pending results... 
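Annotation: the debug task above printed __network_connections_result for the "statebr" bridge profile, and the play now reaches activate_profile.yml:3, which pulls the role in again via include_role (the following lines show defaults/main.yml, meta/main.yml and tasks/main.yml being loaded for fedora.linux_system_roles.network). A sketch of that include; only the include itself is confirmed by the log, and the vars block is an assumption about how the activation helper hands the profile to the role:

# Sketch of tests/network/playbooks/tasks/activate_profile.yml:3.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:        # assumed wiring, not shown in the log
      - name: statebr           # profile created earlier in this run
        state: up               # assumed: the helper's purpose is activation
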
46400 1727204588.93782: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204588.94345: in run() - task 0affcd87-79f5-1303-fda8-0000000017d5 46400 1727204588.94367: variable 'ansible_search_path' from source: unknown 46400 1727204588.94378: variable 'ansible_search_path' from source: unknown 46400 1727204588.94427: calling self._execute() 46400 1727204588.94578: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204588.94591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204588.94614: variable 'omit' from source: magic vars 46400 1727204588.96017: variable 'ansible_distribution_major_version' from source: facts 46400 1727204588.96051: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204588.96068: _execute() done 46400 1727204588.96072: dumping result to json 46400 1727204588.96075: done dumping result, returning 46400 1727204588.96077: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-0000000017d5] 46400 1727204588.96080: sending task result for task 0affcd87-79f5-1303-fda8-0000000017d5 46400 1727204588.96378: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017d5 46400 1727204588.96383: WORKER PROCESS EXITING 46400 1727204588.96409: no more pending results, returning what we have 46400 1727204588.96417: in VariableManager get_vars() 46400 1727204588.96476: Calling all_inventory to load vars for managed-node2 46400 1727204588.96479: Calling groups_inventory to load vars for managed-node2 46400 1727204588.96484: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204588.96496: Calling all_plugins_play to load vars for managed-node2 46400 1727204588.96499: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204588.96502: Calling groups_plugins_play to load vars for managed-node2 46400 1727204588.98550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.00615: done with get_vars() 46400 1727204589.00633: variable 'ansible_search_path' from source: unknown 46400 1727204589.00634: variable 'ansible_search_path' from source: unknown 46400 1727204589.00802: variable 'omit' from source: magic vars 46400 1727204589.00831: variable 'omit' from source: magic vars 46400 1727204589.00851: variable 'omit' from source: magic vars 46400 1727204589.00855: we have included files to process 46400 1727204589.00856: generating all_blocks data 46400 1727204589.00858: done generating all_blocks data 46400 1727204589.00867: processing included file: fedora.linux_system_roles.network 46400 1727204589.00882: in VariableManager get_vars() 46400 1727204589.00894: done with get_vars() 46400 1727204589.00914: in VariableManager get_vars() 46400 1727204589.00925: done with get_vars() 46400 1727204589.00966: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204589.01101: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204589.01193: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204589.01679: in VariableManager get_vars() 46400 1727204589.01694: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204589.03478: iterating over new_blocks loaded from 
include file 46400 1727204589.03480: in VariableManager get_vars() 46400 1727204589.03500: done with get_vars() 46400 1727204589.03502: filtering new block on tags 46400 1727204589.03740: done filtering new block on tags 46400 1727204589.03743: in VariableManager get_vars() 46400 1727204589.03753: done with get_vars() 46400 1727204589.03754: filtering new block on tags 46400 1727204589.03767: done filtering new block on tags 46400 1727204589.03769: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204589.03773: extending task lists for all hosts with included blocks 46400 1727204589.03856: done extending task lists 46400 1727204589.03857: done processing included files 46400 1727204589.03858: results queue empty 46400 1727204589.03858: checking for any_errors_fatal 46400 1727204589.03863: done checking for any_errors_fatal 46400 1727204589.03865: checking for max_fail_percentage 46400 1727204589.03866: done checking for max_fail_percentage 46400 1727204589.03867: checking to see if all hosts have failed and the running result is not ok 46400 1727204589.03867: done checking to see if all hosts have failed 46400 1727204589.03868: getting the remaining hosts for this loop 46400 1727204589.03868: done getting the remaining hosts for this loop 46400 1727204589.03870: getting the next task for host managed-node2 46400 1727204589.03873: done getting next task for host managed-node2 46400 1727204589.03875: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204589.03877: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204589.03884: getting variables 46400 1727204589.03885: in VariableManager get_vars() 46400 1727204589.03894: Calling all_inventory to load vars for managed-node2 46400 1727204589.03895: Calling groups_inventory to load vars for managed-node2 46400 1727204589.03897: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.03900: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.03902: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.03903: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.04820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.06069: done with get_vars() 46400 1727204589.06100: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.134) 0:01:19.346 ***** 46400 1727204589.06185: entering _queue_task() for managed-node2/include_tasks 46400 1727204589.06549: worker is 1 (out of 1 available) 46400 1727204589.06561: exiting _queue_task() for managed-node2/include_tasks 46400 1727204589.06589: done queuing things up, now waiting for results queue to drain 46400 1727204589.06591: waiting for pending results... 46400 1727204589.07003: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204589.07218: in run() - task 0affcd87-79f5-1303-fda8-0000000019bf 46400 1727204589.07231: variable 'ansible_search_path' from source: unknown 46400 1727204589.07234: variable 'ansible_search_path' from source: unknown 46400 1727204589.07290: calling self._execute() 46400 1727204589.07383: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.07389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.07399: variable 'omit' from source: magic vars 46400 1727204589.07838: variable 'ansible_distribution_major_version' from source: facts 46400 1727204589.07849: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204589.07855: _execute() done 46400 1727204589.07858: dumping result to json 46400 1727204589.07863: done dumping result, returning 46400 1727204589.07870: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-0000000019bf] 46400 1727204589.07877: sending task result for task 0affcd87-79f5-1303-fda8-0000000019bf 46400 1727204589.07998: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019bf 46400 1727204589.08002: WORKER PROCESS EXITING 46400 1727204589.08053: no more pending results, returning what we have 46400 1727204589.08058: in VariableManager get_vars() 46400 1727204589.08109: Calling all_inventory to load vars for managed-node2 46400 1727204589.08113: Calling groups_inventory to load vars for managed-node2 46400 1727204589.08115: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.08127: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.08131: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.08133: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.09421: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.10384: done with get_vars() 46400 1727204589.10403: variable 'ansible_search_path' from source: unknown 46400 1727204589.10404: variable 'ansible_search_path' from source: unknown 46400 1727204589.10442: we have included files to process 46400 1727204589.10444: generating all_blocks data 46400 1727204589.10446: done generating all_blocks data 46400 1727204589.10448: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204589.10449: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204589.10451: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204589.10967: done processing included file 46400 1727204589.10969: iterating over new_blocks loaded from include file 46400 1727204589.10970: in VariableManager get_vars() 46400 1727204589.10993: done with get_vars() 46400 1727204589.10995: filtering new block on tags 46400 1727204589.11021: done filtering new block on tags 46400 1727204589.11024: in VariableManager get_vars() 46400 1727204589.11044: done with get_vars() 46400 1727204589.11045: filtering new block on tags 46400 1727204589.11085: done filtering new block on tags 46400 1727204589.11087: in VariableManager get_vars() 46400 1727204589.11107: done with get_vars() 46400 1727204589.11108: filtering new block on tags 46400 1727204589.11146: done filtering new block on tags 46400 1727204589.11147: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204589.11152: extending task lists for all hosts with included blocks 46400 1727204589.12574: done extending task lists 46400 1727204589.12575: done processing included files 46400 1727204589.12575: results queue empty 46400 1727204589.12576: checking for any_errors_fatal 46400 1727204589.12578: done checking for any_errors_fatal 46400 1727204589.12579: checking for max_fail_percentage 46400 1727204589.12579: done checking for max_fail_percentage 46400 1727204589.12580: checking to see if all hosts have failed and the running result is not ok 46400 1727204589.12581: done checking to see if all hosts have failed 46400 1727204589.12581: getting the remaining hosts for this loop 46400 1727204589.12582: done getting the remaining hosts for this loop 46400 1727204589.12584: getting the next task for host managed-node2 46400 1727204589.12587: done getting next task for host managed-node2 46400 1727204589.12589: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204589.12592: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204589.12600: getting variables 46400 1727204589.12600: in VariableManager get_vars() 46400 1727204589.12611: Calling all_inventory to load vars for managed-node2 46400 1727204589.12612: Calling groups_inventory to load vars for managed-node2 46400 1727204589.12613: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.12617: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.12619: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.12620: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.13357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.18467: done with get_vars() 46400 1727204589.18488: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.123) 0:01:19.470 ***** 46400 1727204589.18543: entering _queue_task() for managed-node2/setup 46400 1727204589.18796: worker is 1 (out of 1 available) 46400 1727204589.18810: exiting _queue_task() for managed-node2/setup 46400 1727204589.18824: done queuing things up, now waiting for results queue to drain 46400 1727204589.18827: waiting for pending results... 
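Annotation: the role's first task, "Ensure ansible_facts used by role are present" (set_facts.yml:3), is now queued; as the log below shows, its condition __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluates to False because all required facts are already cached, so the task is skipped with a no_log-censored result. A minimal sketch of that guarded fact-gathering pattern, with the gather_subset value marked as an assumption since it is not visible in the log:

# Conditional fact gathering as evidenced by the log: run setup only when a
# fact named in __network_required_facts is missing from ansible_facts.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min          # assumed subset; not shown in the log output
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                  # matches the "censored" skip result shown below
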
46400 1727204589.19020: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204589.19132: in run() - task 0affcd87-79f5-1303-fda8-000000001a16 46400 1727204589.19143: variable 'ansible_search_path' from source: unknown 46400 1727204589.19148: variable 'ansible_search_path' from source: unknown 46400 1727204589.19181: calling self._execute() 46400 1727204589.19258: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.19267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.19278: variable 'omit' from source: magic vars 46400 1727204589.19559: variable 'ansible_distribution_major_version' from source: facts 46400 1727204589.19573: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204589.19721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204589.21353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204589.21412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204589.21440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204589.21468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204589.21488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204589.21549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204589.21574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204589.21593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204589.21625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204589.21635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204589.21678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204589.21694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204589.21717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204589.21740: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204589.21751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204589.21866: variable '__network_required_facts' from source: role '' defaults 46400 1727204589.21872: variable 'ansible_facts' from source: unknown 46400 1727204589.22411: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204589.22419: when evaluation is False, skipping this task 46400 1727204589.22426: _execute() done 46400 1727204589.22433: dumping result to json 46400 1727204589.22441: done dumping result, returning 46400 1727204589.22452: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000001a16] 46400 1727204589.22461: sending task result for task 0affcd87-79f5-1303-fda8-000000001a16 46400 1727204589.22570: done sending task result for task 0affcd87-79f5-1303-fda8-000000001a16 46400 1727204589.22578: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204589.22621: no more pending results, returning what we have 46400 1727204589.22626: results queue empty 46400 1727204589.22627: checking for any_errors_fatal 46400 1727204589.22628: done checking for any_errors_fatal 46400 1727204589.22629: checking for max_fail_percentage 46400 1727204589.22630: done checking for max_fail_percentage 46400 1727204589.22631: checking to see if all hosts have failed and the running result is not ok 46400 1727204589.22632: done checking to see if all hosts have failed 46400 1727204589.22633: getting the remaining hosts for this loop 46400 1727204589.22634: done getting the remaining hosts for this loop 46400 1727204589.22638: getting the next task for host managed-node2 46400 1727204589.22649: done getting next task for host managed-node2 46400 1727204589.22653: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204589.22659: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204589.22693: getting variables 46400 1727204589.22695: in VariableManager get_vars() 46400 1727204589.22737: Calling all_inventory to load vars for managed-node2 46400 1727204589.22740: Calling groups_inventory to load vars for managed-node2 46400 1727204589.22742: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.22752: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.22755: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.22772: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.24076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.25010: done with get_vars() 46400 1727204589.25031: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.065) 0:01:19.535 ***** 46400 1727204589.25115: entering _queue_task() for managed-node2/stat 46400 1727204589.25359: worker is 1 (out of 1 available) 46400 1727204589.25375: exiting _queue_task() for managed-node2/stat 46400 1727204589.25388: done queuing things up, now waiting for results queue to drain 46400 1727204589.25390: waiting for pending results... 46400 1727204589.25656: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204589.25839: in run() - task 0affcd87-79f5-1303-fda8-000000001a18 46400 1727204589.25860: variable 'ansible_search_path' from source: unknown 46400 1727204589.25872: variable 'ansible_search_path' from source: unknown 46400 1727204589.25912: calling self._execute() 46400 1727204589.26013: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.26026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.26041: variable 'omit' from source: magic vars 46400 1727204589.26426: variable 'ansible_distribution_major_version' from source: facts 46400 1727204589.26444: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204589.26623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204589.26907: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204589.26966: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204589.27046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204589.27086: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204589.27179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204589.27209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204589.27240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204589.27273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204589.27365: variable '__network_is_ostree' from source: set_fact 46400 1727204589.27381: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204589.27389: when evaluation is False, skipping this task 46400 1727204589.27395: _execute() done 46400 1727204589.27402: dumping result to json 46400 1727204589.27409: done dumping result, returning 46400 1727204589.27419: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000001a18] 46400 1727204589.27429: sending task result for task 0affcd87-79f5-1303-fda8-000000001a18 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204589.27568: no more pending results, returning what we have 46400 1727204589.27572: results queue empty 46400 1727204589.27574: checking for any_errors_fatal 46400 1727204589.27579: done checking for any_errors_fatal 46400 1727204589.27580: checking for max_fail_percentage 46400 1727204589.27582: done checking for max_fail_percentage 46400 1727204589.27583: checking to see if all hosts have failed and the running result is not ok 46400 1727204589.27584: done checking to see if all hosts have failed 46400 1727204589.27585: getting the remaining hosts for this loop 46400 1727204589.27587: done getting the remaining hosts for this loop 46400 1727204589.27591: getting the next task for host managed-node2 46400 1727204589.27602: done getting next task for host managed-node2 46400 1727204589.27606: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204589.27612: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204589.27641: getting variables 46400 1727204589.27643: in VariableManager get_vars() 46400 1727204589.27687: Calling all_inventory to load vars for managed-node2 46400 1727204589.27690: Calling groups_inventory to load vars for managed-node2 46400 1727204589.27692: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.27703: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.27705: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.27708: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.28413: done sending task result for task 0affcd87-79f5-1303-fda8-000000001a18 46400 1727204589.28418: WORKER PROCESS EXITING 46400 1727204589.29286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.30904: done with get_vars() 46400 1727204589.30932: done getting variables 46400 1727204589.30998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.059) 0:01:19.594 ***** 46400 1727204589.31041: entering _queue_task() for managed-node2/set_fact 46400 1727204589.31387: worker is 1 (out of 1 available) 46400 1727204589.31401: exiting _queue_task() for managed-node2/set_fact 46400 1727204589.31414: done queuing things up, now waiting for results queue to drain 46400 1727204589.31416: waiting for pending results... 
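The two ostree-related tasks around this point follow a detect-once pattern: a stat check guarded by "not __network_is_ostree is defined", then a set_fact that records the result. Both are skipped in this run because __network_is_ostree was already set by an earlier set_fact. A hedged sketch of that pattern follows; the stat path and register name are assumptions, not values taken from this log.

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed path, not shown in this log
  register: __ostree_booted_stat    # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined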
46400 1727204589.31722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204589.31886: in run() - task 0affcd87-79f5-1303-fda8-000000001a19 46400 1727204589.31907: variable 'ansible_search_path' from source: unknown 46400 1727204589.31915: variable 'ansible_search_path' from source: unknown 46400 1727204589.31956: calling self._execute() 46400 1727204589.32062: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.32082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.32098: variable 'omit' from source: magic vars 46400 1727204589.32523: variable 'ansible_distribution_major_version' from source: facts 46400 1727204589.32533: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204589.32661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204589.32870: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204589.32906: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204589.32965: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204589.32994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204589.33059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204589.33082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204589.33101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204589.33119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204589.33188: variable '__network_is_ostree' from source: set_fact 46400 1727204589.33193: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204589.33197: when evaluation is False, skipping this task 46400 1727204589.33199: _execute() done 46400 1727204589.33202: dumping result to json 46400 1727204589.33205: done dumping result, returning 46400 1727204589.33211: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000001a19] 46400 1727204589.33217: sending task result for task 0affcd87-79f5-1303-fda8-000000001a19 46400 1727204589.33311: done sending task result for task 0affcd87-79f5-1303-fda8-000000001a19 46400 1727204589.33313: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204589.33355: no more pending results, returning what we have 46400 1727204589.33360: results queue empty 46400 1727204589.33363: checking for any_errors_fatal 46400 1727204589.33373: done checking for any_errors_fatal 46400 
1727204589.33374: checking for max_fail_percentage 46400 1727204589.33375: done checking for max_fail_percentage 46400 1727204589.33376: checking to see if all hosts have failed and the running result is not ok 46400 1727204589.33377: done checking to see if all hosts have failed 46400 1727204589.33378: getting the remaining hosts for this loop 46400 1727204589.33380: done getting the remaining hosts for this loop 46400 1727204589.33384: getting the next task for host managed-node2 46400 1727204589.33396: done getting next task for host managed-node2 46400 1727204589.33400: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204589.33406: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204589.33430: getting variables 46400 1727204589.33432: in VariableManager get_vars() 46400 1727204589.33475: Calling all_inventory to load vars for managed-node2 46400 1727204589.33477: Calling groups_inventory to load vars for managed-node2 46400 1727204589.33480: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204589.33489: Calling all_plugins_play to load vars for managed-node2 46400 1727204589.33492: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204589.33494: Calling groups_plugins_play to load vars for managed-node2 46400 1727204589.34693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204589.37092: done with get_vars() 46400 1727204589.37123: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.061) 0:01:19.656 ***** 46400 1727204589.37224: entering _queue_task() for managed-node2/service_facts 46400 1727204589.37565: worker is 1 (out of 1 available) 46400 1727204589.37577: exiting _queue_task() for managed-node2/service_facts 46400 1727204589.37590: done queuing things up, now waiting for results queue to drain 46400 1727204589.37592: waiting for pending results... 
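The task queued above runs the builtin service_facts module on managed-node2 over SSH; the large JSON payload later in this log is its return value, a dictionary of systemd units exposed as ansible_facts.services. A minimal sketch of the task, plus a hypothetical follow-up task that reads one entry (the follow-up is an illustration only, not part of this run):

- name: Check which services are running
  service_facts:

- name: Example only, not part of this run - inspect one gathered entry
  debug:
    msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"

In the facts returned below, NetworkManager.service reports state running with status enabled, which is the kind of entry such a follow-up would read.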
46400 1727204589.39371: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204589.39689: in run() - task 0affcd87-79f5-1303-fda8-000000001a1b 46400 1727204589.39709: variable 'ansible_search_path' from source: unknown 46400 1727204589.39716: variable 'ansible_search_path' from source: unknown 46400 1727204589.39803: calling self._execute() 46400 1727204589.39915: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.39928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.39942: variable 'omit' from source: magic vars 46400 1727204589.40700: variable 'ansible_distribution_major_version' from source: facts 46400 1727204589.40713: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204589.40719: variable 'omit' from source: magic vars 46400 1727204589.40818: variable 'omit' from source: magic vars 46400 1727204589.40879: variable 'omit' from source: magic vars 46400 1727204589.40930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204589.40980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204589.41003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204589.41022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204589.41031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204589.41068: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204589.41072: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.41075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.41186: Set connection var ansible_shell_type to sh 46400 1727204589.41201: Set connection var ansible_shell_executable to /bin/sh 46400 1727204589.41207: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204589.41212: Set connection var ansible_connection to ssh 46400 1727204589.41218: Set connection var ansible_pipelining to False 46400 1727204589.41224: Set connection var ansible_timeout to 10 46400 1727204589.41290: variable 'ansible_shell_executable' from source: unknown 46400 1727204589.41293: variable 'ansible_connection' from source: unknown 46400 1727204589.41305: variable 'ansible_module_compression' from source: unknown 46400 1727204589.41309: variable 'ansible_shell_type' from source: unknown 46400 1727204589.41313: variable 'ansible_shell_executable' from source: unknown 46400 1727204589.42174: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204589.42179: variable 'ansible_pipelining' from source: unknown 46400 1727204589.42182: variable 'ansible_timeout' from source: unknown 46400 1727204589.42185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204589.43371: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204589.43376: variable 'omit' from source: magic vars 46400 
1727204589.43379: starting attempt loop 46400 1727204589.43381: running the handler 46400 1727204589.43383: _low_level_execute_command(): starting 46400 1727204589.43385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204589.45446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.45456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.45686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.45690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.45908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.45918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.46076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204589.46122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204589.46231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204589.47878: stdout chunk (state=3): >>>/root <<< 46400 1727204589.48053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204589.48098: stderr chunk (state=3): >>><<< 46400 1727204589.48106: stdout chunk (state=3): >>><<< 46400 1727204589.48133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204589.48146: _low_level_execute_command(): starting 46400 1727204589.48169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268 `" && echo ansible-tmp-1727204589.4813054-51982-113761677851268="` echo /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268 `" ) && sleep 0' 46400 1727204589.49865: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204589.49885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.49902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.49923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.49975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.49993: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204589.50006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.50022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204589.50079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204589.50095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204589.50110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.50126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.50143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.50157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.50177: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204589.50193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.50276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204589.50495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204589.50514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204589.50593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204589.52467: stdout chunk (state=3): >>>ansible-tmp-1727204589.4813054-51982-113761677851268=/root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268 <<< 46400 1727204589.52682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204589.52687: stdout chunk (state=3): >>><<< 46400 1727204589.52689: stderr chunk (state=3): >>><<< 46400 1727204589.52771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204589.4813054-51982-113761677851268=/root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204589.52777: variable 'ansible_module_compression' from source: unknown 46400 1727204589.52894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204589.52898: variable 'ansible_facts' from source: unknown 46400 1727204589.52970: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/AnsiballZ_service_facts.py 46400 1727204589.53593: Sending initial data 46400 1727204589.53597: Sent initial data (162 bytes) 46400 1727204589.55285: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204589.55308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.55324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.55342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.55395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.55408: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204589.55427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.55446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204589.55458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204589.55479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204589.55493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.55508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.55529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.55543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.55554: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204589.55710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.55800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204589.55824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204589.55841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204589.55918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204589.57628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204589.57671: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204589.57708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmprcww1cy3 /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/AnsiballZ_service_facts.py <<< 46400 1727204589.57749: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204589.59178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204589.59254: stderr chunk (state=3): >>><<< 46400 1727204589.59258: stdout chunk (state=3): >>><<< 46400 1727204589.59284: done transferring module to remote 46400 1727204589.59297: _low_level_execute_command(): starting 46400 1727204589.59300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/ /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/AnsiballZ_service_facts.py && sleep 0' 46400 1727204589.61278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204589.61298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.61314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.61336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.61479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.61491: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204589.61505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.61522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204589.61553: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204589.61575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204589.61588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.61601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.61689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.61703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.61714: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204589.61728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.61808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204589.61905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 
1727204589.61920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204589.62119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204589.63902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204589.63906: stdout chunk (state=3): >>><<< 46400 1727204589.63908: stderr chunk (state=3): >>><<< 46400 1727204589.63971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204589.63975: _low_level_execute_command(): starting 46400 1727204589.63977: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/AnsiballZ_service_facts.py && sleep 0' 46400 1727204589.65418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204589.65661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.65678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.65830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.65874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.65881: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204589.65892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.65911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204589.65919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204589.65926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204589.65936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204589.65942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204589.65954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204589.66025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204589.66032: stderr chunk (state=3): >>>debug2: match found <<< 
46400 1727204589.66045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204589.66123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204589.66289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204589.66302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204589.66389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204591.95508: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": 
{"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204591.95558: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 46400 1727204591.95578: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204591.96897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204591.96901: stdout chunk (state=3): >>><<< 46400 1727204591.96903: stderr chunk (state=3): >>><<< 46400 1727204591.96973: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": 
"systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204591.98001: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204591.98017: _low_level_execute_command(): starting 46400 1727204591.98026: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204589.4813054-51982-113761677851268/ > /dev/null 2>&1 && sleep 0' 46400 1727204591.98640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204591.98656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204591.98677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204591.98694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204591.98736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204591.98749: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204591.98778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204591.98800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204591.98835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204591.98848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204591.98906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204591.98923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204591.98926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204591.98981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.00762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.00825: stderr chunk (state=3): >>><<< 46400 1727204592.00829: stdout chunk (state=3): >>><<< 46400 1727204592.00871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204592.00875: handler run complete 46400 1727204592.01077: variable 'ansible_facts' from source: unknown 46400 1727204592.01209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.01706: variable 'ansible_facts' from source: unknown 46400 1727204592.01798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.01910: attempt loop complete, returning result 46400 1727204592.01913: _execute() done 46400 1727204592.01916: dumping result to json 46400 1727204592.01957: done dumping result, returning 46400 1727204592.01965: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000001a1b] 46400 1727204592.01976: sending task result for task 0affcd87-79f5-1303-fda8-000000001a1b 46400 1727204592.02724: done sending task result for task 0affcd87-79f5-1303-fda8-000000001a1b 46400 1727204592.02727: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204592.02781: no more pending results, returning what we have 46400 1727204592.02783: results queue empty 46400 1727204592.02784: checking for any_errors_fatal 46400 1727204592.02787: done checking for any_errors_fatal 46400 1727204592.02787: checking for max_fail_percentage 46400 1727204592.02788: done checking for max_fail_percentage 46400 1727204592.02789: checking to see if all hosts have failed and the running result is not ok 46400 1727204592.02789: done checking to see if all hosts have failed 46400 1727204592.02790: getting the remaining hosts for this loop 46400 1727204592.02791: done getting the remaining hosts for this loop 46400 1727204592.02793: getting the next task for host managed-node2 46400 1727204592.02798: done getting next task for host managed-node2 46400 1727204592.02801: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204592.02808: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204592.02816: getting variables 46400 1727204592.02817: in VariableManager get_vars() 46400 1727204592.02840: Calling all_inventory to load vars for managed-node2 46400 1727204592.02842: Calling groups_inventory to load vars for managed-node2 46400 1727204592.02848: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204592.02855: Calling all_plugins_play to load vars for managed-node2 46400 1727204592.02856: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204592.02858: Calling groups_plugins_play to load vars for managed-node2 46400 1727204592.03645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.05422: done with get_vars() 46400 1727204592.05460: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:12 -0400 (0:00:02.683) 0:01:22.340 ***** 46400 1727204592.05576: entering _queue_task() for managed-node2/package_facts 46400 1727204592.05955: worker is 1 (out of 1 available) 46400 1727204592.05971: exiting _queue_task() for managed-node2/package_facts 46400 1727204592.05985: done queuing things up, now waiting for results queue to drain 46400 1727204592.05987: waiting for pending results... 
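The "censored" result printed above is the visible effect of no_log: true on this task: the full service list returned by the module is replaced by a fixed message before the result is displayed. A rough sketch of that idea follows; it is not Ansible's internal code, and censor_result is a hypothetical helper used only to illustrate the behaviour seen in the log.

# Illustrative sketch only - not Ansible's implementation. It mirrors what the log
# shows: with no_log set, the displayed result keeps only bookkeeping fields plus
# the fixed "censored" message quoted verbatim from the output above.
CENSORED_MSG = ("the output has been hidden due to the fact that "
                "'no_log: true' was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    """Return the result as it would be shown to the user (hypothetical helper)."""
    if not no_log:
        return result
    return {"censored": CENSORED_MSG, "changed": result.get("changed", False)}

full = {"ansible_facts": {"services": {"...": "..."}}, "changed": False}
print(censor_result(full, no_log=True))
# -> {'censored': "the output has been hidden due to ...", 'changed': False}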
46400 1727204592.06305: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204592.06480: in run() - task 0affcd87-79f5-1303-fda8-000000001a1c 46400 1727204592.06495: variable 'ansible_search_path' from source: unknown 46400 1727204592.06499: variable 'ansible_search_path' from source: unknown 46400 1727204592.06534: calling self._execute() 46400 1727204592.06643: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204592.06656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204592.06670: variable 'omit' from source: magic vars 46400 1727204592.07076: variable 'ansible_distribution_major_version' from source: facts 46400 1727204592.07097: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204592.07103: variable 'omit' from source: magic vars 46400 1727204592.07186: variable 'omit' from source: magic vars 46400 1727204592.07229: variable 'omit' from source: magic vars 46400 1727204592.07274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204592.07318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204592.07341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204592.07358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204592.07371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204592.07399: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204592.07403: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204592.07406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204592.07511: Set connection var ansible_shell_type to sh 46400 1727204592.07520: Set connection var ansible_shell_executable to /bin/sh 46400 1727204592.07536: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204592.07541: Set connection var ansible_connection to ssh 46400 1727204592.07547: Set connection var ansible_pipelining to False 46400 1727204592.07553: Set connection var ansible_timeout to 10 46400 1727204592.07580: variable 'ansible_shell_executable' from source: unknown 46400 1727204592.07583: variable 'ansible_connection' from source: unknown 46400 1727204592.07586: variable 'ansible_module_compression' from source: unknown 46400 1727204592.07589: variable 'ansible_shell_type' from source: unknown 46400 1727204592.07591: variable 'ansible_shell_executable' from source: unknown 46400 1727204592.07593: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204592.07595: variable 'ansible_pipelining' from source: unknown 46400 1727204592.07598: variable 'ansible_timeout' from source: unknown 46400 1727204592.07602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204592.07819: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204592.07829: variable 'omit' from source: magic vars 46400 
1727204592.07834: starting attempt loop 46400 1727204592.07837: running the handler 46400 1727204592.07852: _low_level_execute_command(): starting 46400 1727204592.07870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204592.08672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.08686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.08697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.08713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.08766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.08770: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.08782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.08796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.08805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204592.08812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.08821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.08831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.08845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.08863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.08870: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.08880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.08952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204592.08983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.08996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.09065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.10605: stdout chunk (state=3): >>>/root <<< 46400 1727204592.10709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.10808: stderr chunk (state=3): >>><<< 46400 1727204592.10823: stdout chunk (state=3): >>><<< 46400 1727204592.10971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204592.10975: _low_level_execute_command(): starting 46400 1727204592.10978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468 `" && echo ansible-tmp-1727204592.1086087-52329-22846185344468="` echo /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468 `" ) && sleep 0' 46400 1727204592.11590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.11614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.11631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.11650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.11697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.11709: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.11724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.11741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.11753: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204592.11770: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.11782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.11795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.11811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.11823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.11834: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.11847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.11924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204592.11946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.11966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.12040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.13869: stdout chunk (state=3): >>>ansible-tmp-1727204592.1086087-52329-22846185344468=/root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468 <<< 46400 1727204592.14083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.14087: stdout chunk (state=3): >>><<< 46400 1727204592.14089: stderr chunk (state=3): >>><<< 46400 1727204592.14178: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204592.1086087-52329-22846185344468=/root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204592.14183: variable 'ansible_module_compression' from source: unknown 46400 1727204592.14377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204592.14380: variable 'ansible_facts' from source: unknown 46400 1727204592.14509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/AnsiballZ_package_facts.py 46400 1727204592.14694: Sending initial data 46400 1727204592.14697: Sent initial data (161 bytes) 46400 1727204592.16067: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.16094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.16185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.16213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.16255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.16274: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.16292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.16318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.16330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204592.16341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.16352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.16370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.16387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.16402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.16413: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.16436: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.16521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204592.16553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.16577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.16653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.18342: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204592.18371: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204592.18409: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpco4xof1v /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/AnsiballZ_package_facts.py <<< 46400 1727204592.18449: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204592.21430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.21569: stderr chunk (state=3): >>><<< 46400 1727204592.21573: stdout chunk (state=3): >>><<< 46400 1727204592.21575: done transferring module to remote 46400 1727204592.21581: _low_level_execute_command(): starting 46400 1727204592.21584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/ /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/AnsiballZ_package_facts.py && sleep 0' 46400 1727204592.22647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.22658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.22672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.22692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.22737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.22799: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.22815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.22829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.22838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204592.22846: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.22854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.22865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.22878: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.22886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.22891: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.22906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.22986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204592.23128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.23141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.23232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.25120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.25183: stderr chunk (state=3): >>><<< 46400 1727204592.25186: stdout chunk (state=3): >>><<< 46400 1727204592.25205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204592.25208: _low_level_execute_command(): starting 46400 1727204592.25213: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/AnsiballZ_package_facts.py && sleep 0' 46400 1727204592.26392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.26524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.26540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.26555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.26598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.26767: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.26770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.26773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.26775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 
1727204592.26777: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.26779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.27020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.27063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.27073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.27077: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.27080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.27083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204592.27085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.27087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.27089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.74007: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": 
"2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204592.74110: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": 
[{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": 
"9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204592.74118: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204592.74123: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204592.74126: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 46400 1727204592.74129: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": 
"8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204592.74132: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": 
[{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204592.74135: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": 
[{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204592.74137: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 
4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204592.74141: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", 
"release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204592.74143: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204592.74146: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204592.74149: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204592.75798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204592.75803: stdout chunk (state=3): >>><<< 46400 1727204592.75807: stderr chunk (state=3): >>><<< 46400 1727204592.75981: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": 
[{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": 
"3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": 
"92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, 
{"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": 
"systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204592.81197: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204592.81313: _low_level_execute_command(): starting 46400 1727204592.81324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204592.1086087-52329-22846185344468/ > /dev/null 2>&1 && sleep 0' 46400 1727204592.83216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204592.83273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.83373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.83395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.83443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.83466: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204592.83485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.83505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204592.83519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204592.83531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204592.83544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204592.83561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204592.83583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204592.83603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204592.83615: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204592.83629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204592.83761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 
1727204592.83833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204592.83852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204592.83941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204592.85920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204592.85924: stdout chunk (state=3): >>><<< 46400 1727204592.85926: stderr chunk (state=3): >>><<< 46400 1727204592.86173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204592.86177: handler run complete 46400 1727204592.87478: variable 'ansible_facts' from source: unknown 46400 1727204592.88127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.91919: variable 'ansible_facts' from source: unknown 46400 1727204592.92539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.93461: attempt loop complete, returning result 46400 1727204592.93490: _execute() done 46400 1727204592.93499: dumping result to json 46400 1727204592.93733: done dumping result, returning 46400 1727204592.93749: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000001a1c] 46400 1727204592.93760: sending task result for task 0affcd87-79f5-1303-fda8-000000001a1c ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204592.96824: no more pending results, returning what we have 46400 1727204592.96828: results queue empty 46400 1727204592.96829: checking for any_errors_fatal 46400 1727204592.96833: done checking for any_errors_fatal 46400 1727204592.96834: checking for max_fail_percentage 46400 1727204592.96836: done checking for max_fail_percentage 46400 1727204592.96837: checking to see if all hosts have failed and the running result is not ok 46400 1727204592.96838: done checking to see if all hosts have failed 46400 1727204592.96839: getting the remaining hosts for this loop 46400 1727204592.96841: done getting the remaining hosts for this loop 46400 1727204592.96844: getting the next task for host managed-node2 46400 1727204592.96852: done 
getting next task for host managed-node2 46400 1727204592.96856: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204592.96867: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204592.96882: getting variables 46400 1727204592.96884: in VariableManager get_vars() 46400 1727204592.96920: Calling all_inventory to load vars for managed-node2 46400 1727204592.96923: Calling groups_inventory to load vars for managed-node2 46400 1727204592.96926: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204592.96936: Calling all_plugins_play to load vars for managed-node2 46400 1727204592.96943: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204592.96946: Calling groups_plugins_play to load vars for managed-node2 46400 1727204592.97882: done sending task result for task 0affcd87-79f5-1303-fda8-000000001a1c 46400 1727204592.97886: WORKER PROCESS EXITING 46400 1727204592.97947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204592.98902: done with get_vars() 46400 1727204592.98923: done getting variables 46400 1727204592.98976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.934) 0:01:23.274 ***** 46400 1727204592.99003: entering _queue_task() for managed-node2/debug 46400 1727204592.99257: worker is 1 (out of 1 available) 46400 1727204592.99277: exiting _queue_task() for managed-node2/debug 46400 1727204592.99291: done queuing things up, now waiting for results queue to drain 46400 1727204592.99293: waiting for pending results... 
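The package inventory above was gathered by the package_facts module; the module_args recorded in the debug stream show manager=["auto"] with strategy "first", and because the task runs with no_log: true the task output only prints the censored result even though the full package list appears in this trace. A minimal sketch of such a task, with illustrative wording since the role's actual tasks file is not reproduced in this log:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # matches the module_args seen above; strategy "first" is the module default
      no_log: true           # this is why the result is rendered as "censored" in the task output
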
46400 1727204592.99495: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204592.99594: in run() - task 0affcd87-79f5-1303-fda8-0000000019c0 46400 1727204592.99604: variable 'ansible_search_path' from source: unknown 46400 1727204592.99610: variable 'ansible_search_path' from source: unknown 46400 1727204592.99640: calling self._execute() 46400 1727204592.99722: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204592.99726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204592.99736: variable 'omit' from source: magic vars 46400 1727204593.00132: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.00149: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.00162: variable 'omit' from source: magic vars 46400 1727204593.00245: variable 'omit' from source: magic vars 46400 1727204593.00357: variable 'network_provider' from source: set_fact 46400 1727204593.00386: variable 'omit' from source: magic vars 46400 1727204593.00433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204593.00479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204593.00508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204593.00531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204593.00546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204593.00590: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204593.00605: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.00615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.00712: Set connection var ansible_shell_type to sh 46400 1727204593.00731: Set connection var ansible_shell_executable to /bin/sh 46400 1727204593.00745: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204593.00758: Set connection var ansible_connection to ssh 46400 1727204593.00772: Set connection var ansible_pipelining to False 46400 1727204593.00782: Set connection var ansible_timeout to 10 46400 1727204593.00809: variable 'ansible_shell_executable' from source: unknown 46400 1727204593.00816: variable 'ansible_connection' from source: unknown 46400 1727204593.00822: variable 'ansible_module_compression' from source: unknown 46400 1727204593.00828: variable 'ansible_shell_type' from source: unknown 46400 1727204593.00841: variable 'ansible_shell_executable' from source: unknown 46400 1727204593.00848: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.00861: variable 'ansible_pipelining' from source: unknown 46400 1727204593.00872: variable 'ansible_timeout' from source: unknown 46400 1727204593.00880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.01020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204593.01047: variable 'omit' from source: magic vars 46400 1727204593.01052: starting attempt loop 46400 1727204593.01055: running the handler 46400 1727204593.01096: handler run complete 46400 1727204593.01117: attempt loop complete, returning result 46400 1727204593.01124: _execute() done 46400 1727204593.01130: dumping result to json 46400 1727204593.01138: done dumping result, returning 46400 1727204593.01156: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-0000000019c0] 46400 1727204593.01168: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c0 46400 1727204593.01277: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c0 46400 1727204593.01284: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204593.01446: no more pending results, returning what we have 46400 1727204593.01451: results queue empty 46400 1727204593.01452: checking for any_errors_fatal 46400 1727204593.01470: done checking for any_errors_fatal 46400 1727204593.01471: checking for max_fail_percentage 46400 1727204593.01473: done checking for max_fail_percentage 46400 1727204593.01474: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.01475: done checking to see if all hosts have failed 46400 1727204593.01476: getting the remaining hosts for this loop 46400 1727204593.01478: done getting the remaining hosts for this loop 46400 1727204593.01482: getting the next task for host managed-node2 46400 1727204593.01490: done getting next task for host managed-node2 46400 1727204593.01495: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204593.01500: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204593.01513: getting variables 46400 1727204593.01515: in VariableManager get_vars() 46400 1727204593.01553: Calling all_inventory to load vars for managed-node2 46400 1727204593.01556: Calling groups_inventory to load vars for managed-node2 46400 1727204593.01560: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.01572: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.01575: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.01578: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.02922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.03877: done with get_vars() 46400 1727204593.03902: done getting variables 46400 1727204593.03948: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.049) 0:01:23.324 ***** 46400 1727204593.03989: entering _queue_task() for managed-node2/fail 46400 1727204593.04244: worker is 1 (out of 1 available) 46400 1727204593.04259: exiting _queue_task() for managed-node2/fail 46400 1727204593.04274: done queuing things up, now waiting for results queue to drain 46400 1727204593.04276: waiting for pending results... 
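The "Print network provider" task that just completed is a plain debug action reporting which backend the role will drive, nm (NetworkManager) on this host. A sketch of the likely shape of that task, assuming the network_provider variable that the trace shows coming from set_fact:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # rendered above as "Using network provider: nm"
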
46400 1727204593.04484: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204593.04594: in run() - task 0affcd87-79f5-1303-fda8-0000000019c1 46400 1727204593.04606: variable 'ansible_search_path' from source: unknown 46400 1727204593.04610: variable 'ansible_search_path' from source: unknown 46400 1727204593.04644: calling self._execute() 46400 1727204593.04726: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.04730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.04735: variable 'omit' from source: magic vars 46400 1727204593.05108: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.05122: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.05243: variable 'network_state' from source: role '' defaults 46400 1727204593.05255: Evaluated conditional (network_state != {}): False 46400 1727204593.05261: when evaluation is False, skipping this task 46400 1727204593.05266: _execute() done 46400 1727204593.05269: dumping result to json 46400 1727204593.05272: done dumping result, returning 46400 1727204593.05275: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-0000000019c1] 46400 1727204593.05287: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c1 46400 1727204593.05391: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c1 46400 1727204593.05394: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204593.05441: no more pending results, returning what we have 46400 1727204593.05446: results queue empty 46400 1727204593.05447: checking for any_errors_fatal 46400 1727204593.05455: done checking for any_errors_fatal 46400 1727204593.05456: checking for max_fail_percentage 46400 1727204593.05458: done checking for max_fail_percentage 46400 1727204593.05461: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.05462: done checking to see if all hosts have failed 46400 1727204593.05462: getting the remaining hosts for this loop 46400 1727204593.05466: done getting the remaining hosts for this loop 46400 1727204593.05471: getting the next task for host managed-node2 46400 1727204593.05479: done getting next task for host managed-node2 46400 1727204593.05484: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204593.05489: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.05516: getting variables 46400 1727204593.05518: in VariableManager get_vars() 46400 1727204593.05556: Calling all_inventory to load vars for managed-node2 46400 1727204593.05561: Calling groups_inventory to load vars for managed-node2 46400 1727204593.05563: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.05584: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.05587: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.05589: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.06746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.07707: done with get_vars() 46400 1727204593.07727: done getting variables 46400 1727204593.07777: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.038) 0:01:23.362 ***** 46400 1727204593.07804: entering _queue_task() for managed-node2/fail 46400 1727204593.08052: worker is 1 (out of 1 available) 46400 1727204593.08067: exiting _queue_task() for managed-node2/fail 46400 1727204593.08081: done queuing things up, now waiting for results queue to drain 46400 1727204593.08083: waiting for pending results... 
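The "Abort applying the network state configuration ..." checks in this stretch are fail actions guarded by a when condition. On this run network_state still holds the role default {}, so the guard network_state != {} evaluates to False and the tasks are skipped rather than failing the play. A sketch of the pattern, with the message text illustrative and only the condition that is visible in this trace:

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying a network state configuration is not supported with the initscripts provider
      when: network_state != {}   # evaluated False here, so the task is skipped
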
46400 1727204593.08288: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204593.08397: in run() - task 0affcd87-79f5-1303-fda8-0000000019c2 46400 1727204593.08408: variable 'ansible_search_path' from source: unknown 46400 1727204593.08416: variable 'ansible_search_path' from source: unknown 46400 1727204593.08448: calling self._execute() 46400 1727204593.08535: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.08540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.08549: variable 'omit' from source: magic vars 46400 1727204593.08850: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.08860: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.08953: variable 'network_state' from source: role '' defaults 46400 1727204593.08957: Evaluated conditional (network_state != {}): False 46400 1727204593.08961: when evaluation is False, skipping this task 46400 1727204593.08964: _execute() done 46400 1727204593.08969: dumping result to json 46400 1727204593.08974: done dumping result, returning 46400 1727204593.08979: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-0000000019c2] 46400 1727204593.08988: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c2 46400 1727204593.09082: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c2 46400 1727204593.09085: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204593.09133: no more pending results, returning what we have 46400 1727204593.09137: results queue empty 46400 1727204593.09139: checking for any_errors_fatal 46400 1727204593.09146: done checking for any_errors_fatal 46400 1727204593.09147: checking for max_fail_percentage 46400 1727204593.09149: done checking for max_fail_percentage 46400 1727204593.09150: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.09151: done checking to see if all hosts have failed 46400 1727204593.09151: getting the remaining hosts for this loop 46400 1727204593.09153: done getting the remaining hosts for this loop 46400 1727204593.09157: getting the next task for host managed-node2 46400 1727204593.09171: done getting next task for host managed-node2 46400 1727204593.09179: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204593.09185: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.09216: getting variables 46400 1727204593.09218: in VariableManager get_vars() 46400 1727204593.09257: Calling all_inventory to load vars for managed-node2 46400 1727204593.09260: Calling groups_inventory to load vars for managed-node2 46400 1727204593.09262: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.09278: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.09282: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.09290: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.10231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.11179: done with get_vars() 46400 1727204593.11199: done getting variables 46400 1727204593.11248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.034) 0:01:23.397 ***** 46400 1727204593.11278: entering _queue_task() for managed-node2/fail 46400 1727204593.11533: worker is 1 (out of 1 available) 46400 1727204593.11546: exiting _queue_task() for managed-node2/fail 46400 1727204593.11559: done queuing things up, now waiting for results queue to drain 46400 1727204593.11560: waiting for pending results... 
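The teaming check queued next is gated on the managed host's distribution major version: on EL10 or later (ansible_distribution_major_version | int > 9) the role aborts, since team interfaces are no longer supported there, while on this EL9 host the condition evaluates to False and the task is skipped. A sketch of that guard; the real task in the role may carry additional conditions (for example, only failing when a team interface is actually requested), which this trace does not show:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later   # message text is illustrative
      when: ansible_distribution_major_version | int > 9
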
46400 1727204593.11769: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204593.11874: in run() - task 0affcd87-79f5-1303-fda8-0000000019c3 46400 1727204593.11882: variable 'ansible_search_path' from source: unknown 46400 1727204593.11886: variable 'ansible_search_path' from source: unknown 46400 1727204593.11923: calling self._execute() 46400 1727204593.12006: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.12011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.12021: variable 'omit' from source: magic vars 46400 1727204593.12317: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.12324: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.12468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.14161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.14211: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.14239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.14267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.14290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.14355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.14396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.14421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.14450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.14462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.14546: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.14560: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204593.14567: when evaluation is False, skipping this task 46400 1727204593.14570: _execute() done 46400 1727204593.14572: dumping result to json 46400 1727204593.14576: done dumping result, returning 46400 1727204593.14584: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-0000000019c3] 46400 1727204593.14589: sending task result for task 
0affcd87-79f5-1303-fda8-0000000019c3 46400 1727204593.14687: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c3 46400 1727204593.14690: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204593.14743: no more pending results, returning what we have 46400 1727204593.14747: results queue empty 46400 1727204593.14748: checking for any_errors_fatal 46400 1727204593.14756: done checking for any_errors_fatal 46400 1727204593.14756: checking for max_fail_percentage 46400 1727204593.14758: done checking for max_fail_percentage 46400 1727204593.14759: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.14760: done checking to see if all hosts have failed 46400 1727204593.14761: getting the remaining hosts for this loop 46400 1727204593.14763: done getting the remaining hosts for this loop 46400 1727204593.14769: getting the next task for host managed-node2 46400 1727204593.14778: done getting next task for host managed-node2 46400 1727204593.14782: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204593.14788: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204593.14813: getting variables 46400 1727204593.14815: in VariableManager get_vars() 46400 1727204593.14870: Calling all_inventory to load vars for managed-node2 46400 1727204593.14873: Calling groups_inventory to load vars for managed-node2 46400 1727204593.14875: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.14885: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.14887: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.14890: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.15820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.17115: done with get_vars() 46400 1727204593.17143: done getting variables 46400 1727204593.17215: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.059) 0:01:23.457 ***** 46400 1727204593.17252: entering _queue_task() for managed-node2/dnf 46400 1727204593.17612: worker is 1 (out of 1 available) 46400 1727204593.17626: exiting _queue_task() for managed-node2/dnf 46400 1727204593.17640: done queuing things up, now waiting for results queue to drain 46400 1727204593.17642: waiting for pending results... 
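
This check (main.yml:36) is dispatched through the dnf action plugin and, per the trace below, is guarded by "ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7" (True here) and "__network_wireless_connections_defined or __network_team_connections_defined" (False here, since the network_connections in play define neither wireless nor team interfaces). A hedged sketch of an "are updates available" probe of this shape, assuming the package list comes from the role's network_packages variable and that check mode keeps the probe read-only; only the when conditions are confirmed by this log, the module arguments are illustrative:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed argument; not visible in this log
        state: latest
      check_mode: true
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
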
46400 1727204593.17962: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204593.18110: in run() - task 0affcd87-79f5-1303-fda8-0000000019c4 46400 1727204593.18129: variable 'ansible_search_path' from source: unknown 46400 1727204593.18133: variable 'ansible_search_path' from source: unknown 46400 1727204593.18162: calling self._execute() 46400 1727204593.18250: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.18254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.18265: variable 'omit' from source: magic vars 46400 1727204593.18569: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.18580: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.18727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.20415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.20465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.20497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.20522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.20541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.20699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.20703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.20705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.20782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.20785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.21071: variable 'ansible_distribution' from source: facts 46400 1727204593.21076: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.21078: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204593.21081: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.21374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.21377: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.21380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.21382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.21385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.21387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.21389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.21391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.21393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.21395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.21469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.21472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.21477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.21480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.21482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.21623: variable 'network_connections' from source: include params 46400 1727204593.21639: variable 'interface' from source: play vars 46400 1727204593.21701: variable 'interface' from source: play vars 46400 1727204593.21780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204593.21938: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204593.21979: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204593.22013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204593.22047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204593.22097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204593.22125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204593.22166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.22197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204593.22248: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204593.22480: variable 'network_connections' from source: include params 46400 1727204593.22491: variable 'interface' from source: play vars 46400 1727204593.22559: variable 'interface' from source: play vars 46400 1727204593.22593: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204593.22602: when evaluation is False, skipping this task 46400 1727204593.22608: _execute() done 46400 1727204593.22615: dumping result to json 46400 1727204593.22621: done dumping result, returning 46400 1727204593.22632: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000019c4] 46400 1727204593.22641: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c4 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204593.22811: no more pending results, returning what we have 46400 1727204593.22816: results queue empty 46400 1727204593.22817: checking for any_errors_fatal 46400 1727204593.22823: done checking for any_errors_fatal 46400 1727204593.22824: checking for max_fail_percentage 46400 1727204593.22826: done checking for max_fail_percentage 46400 1727204593.22827: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.22828: done checking to see if all hosts have failed 46400 1727204593.22828: getting the remaining hosts for this loop 46400 1727204593.22830: done getting the remaining hosts for this loop 46400 1727204593.22834: getting the next task for host managed-node2 46400 1727204593.22844: done getting next task for host managed-node2 46400 1727204593.22848: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204593.22853: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.22868: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c4 46400 1727204593.22871: WORKER PROCESS EXITING 46400 1727204593.22900: getting variables 46400 1727204593.22902: in VariableManager get_vars() 46400 1727204593.22946: Calling all_inventory to load vars for managed-node2 46400 1727204593.22949: Calling groups_inventory to load vars for managed-node2 46400 1727204593.22951: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.22966: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.22969: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.22972: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.24605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.26613: done with get_vars() 46400 1727204593.26650: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204593.26735: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.095) 0:01:23.552 ***** 46400 1727204593.26774: entering _queue_task() for managed-node2/yum 46400 1727204593.27133: worker is 1 (out of 1 available) 46400 1727204593.27146: exiting _queue_task() for managed-node2/yum 46400 1727204593.27160: done queuing things up, now waiting for results queue to drain 46400 1727204593.27162: waiting for pending results... 
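
The YUM flavour of the same probe (main.yml:48) is intended for hosts below EL8; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above, i.e. on this Ansible version the yum action is served by the dnf plugin. Only "ansible_distribution_major_version | int < 8" is evaluated in the trace below (it is False, so any further conditions are never reached). A sketch under the same assumptions as the DNF variant:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # assumed, as for the DNF variant
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8
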
46400 1727204593.27473: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204593.27643: in run() - task 0affcd87-79f5-1303-fda8-0000000019c5 46400 1727204593.27667: variable 'ansible_search_path' from source: unknown 46400 1727204593.27677: variable 'ansible_search_path' from source: unknown 46400 1727204593.27725: calling self._execute() 46400 1727204593.27837: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.27848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.27867: variable 'omit' from source: magic vars 46400 1727204593.28266: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.28284: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.28472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.30901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.30985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.31029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.31072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.31108: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.31198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.31638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.31673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.31721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.31746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.31853: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.31877: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204593.31885: when evaluation is False, skipping this task 46400 1727204593.31892: _execute() done 46400 1727204593.31899: dumping result to json 46400 1727204593.31906: done dumping result, returning 46400 1727204593.31918: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000019c5] 46400 
1727204593.31929: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c5 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204593.32106: no more pending results, returning what we have 46400 1727204593.32111: results queue empty 46400 1727204593.32113: checking for any_errors_fatal 46400 1727204593.32122: done checking for any_errors_fatal 46400 1727204593.32123: checking for max_fail_percentage 46400 1727204593.32125: done checking for max_fail_percentage 46400 1727204593.32126: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.32127: done checking to see if all hosts have failed 46400 1727204593.32128: getting the remaining hosts for this loop 46400 1727204593.32130: done getting the remaining hosts for this loop 46400 1727204593.32134: getting the next task for host managed-node2 46400 1727204593.32143: done getting next task for host managed-node2 46400 1727204593.32148: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204593.32154: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204593.32183: getting variables 46400 1727204593.32185: in VariableManager get_vars() 46400 1727204593.32232: Calling all_inventory to load vars for managed-node2 46400 1727204593.32235: Calling groups_inventory to load vars for managed-node2 46400 1727204593.32238: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.32250: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.32253: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.32256: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.33322: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c5 46400 1727204593.33326: WORKER PROCESS EXITING 46400 1727204593.34217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.35838: done with get_vars() 46400 1727204593.35870: done getting variables 46400 1727204593.35935: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.092) 0:01:23.644 ***** 46400 1727204593.35978: entering _queue_task() for managed-node2/fail 46400 1727204593.36338: worker is 1 (out of 1 available) 46400 1727204593.36351: exiting _queue_task() for managed-node2/fail 46400 1727204593.36368: done queuing things up, now waiting for results queue to drain 46400 1727204593.36370: waiting for pending results... 
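
This guard (main.yml:60) again uses the fail plugin and, as the trace below shows, is skipped because "__network_wireless_connections_defined or __network_team_connections_defined" is False for the connection profiles in this play. A minimal sketch of a consent check of this shape; the message text is an assumption, and the real role likely also tests a user-settable consent flag that does not appear in this log:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-  # assumed wording
          Managing wireless or team interfaces requires restarting NetworkManager;
          set the role's consent variable to allow this.
      when:
        - ansible_distribution_major_version != '6'
        - __network_wireless_connections_defined or __network_team_connections_defined
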
46400 1727204593.36681: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204593.36840: in run() - task 0affcd87-79f5-1303-fda8-0000000019c6 46400 1727204593.36860: variable 'ansible_search_path' from source: unknown 46400 1727204593.36873: variable 'ansible_search_path' from source: unknown 46400 1727204593.36917: calling self._execute() 46400 1727204593.37030: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.37045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.37060: variable 'omit' from source: magic vars 46400 1727204593.37451: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.37474: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.37608: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.37820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.39867: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.39924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.39953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.39991: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.40013: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.40078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.40115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.40134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.40163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.40174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.40213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.40229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.40246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.40275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.40286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.40318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.40333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.40349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.40379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.40390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.40513: variable 'network_connections' from source: include params 46400 1727204593.40525: variable 'interface' from source: play vars 46400 1727204593.40582: variable 'interface' from source: play vars 46400 1727204593.40639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204593.40753: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204593.40783: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204593.40804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204593.40826: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204593.40865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204593.40880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204593.40897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.40914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204593.40952: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204593.41118: variable 'network_connections' 
from source: include params 46400 1727204593.41121: variable 'interface' from source: play vars 46400 1727204593.41168: variable 'interface' from source: play vars 46400 1727204593.41190: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204593.41194: when evaluation is False, skipping this task 46400 1727204593.41196: _execute() done 46400 1727204593.41199: dumping result to json 46400 1727204593.41201: done dumping result, returning 46400 1727204593.41207: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000019c6] 46400 1727204593.41213: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c6 46400 1727204593.41311: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c6 46400 1727204593.41314: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204593.41408: no more pending results, returning what we have 46400 1727204593.41413: results queue empty 46400 1727204593.41414: checking for any_errors_fatal 46400 1727204593.41425: done checking for any_errors_fatal 46400 1727204593.41425: checking for max_fail_percentage 46400 1727204593.41427: done checking for max_fail_percentage 46400 1727204593.41428: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.41429: done checking to see if all hosts have failed 46400 1727204593.41430: getting the remaining hosts for this loop 46400 1727204593.41432: done getting the remaining hosts for this loop 46400 1727204593.41436: getting the next task for host managed-node2 46400 1727204593.41444: done getting next task for host managed-node2 46400 1727204593.41448: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204593.41454: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204593.41520: getting variables 46400 1727204593.41522: in VariableManager get_vars() 46400 1727204593.41565: Calling all_inventory to load vars for managed-node2 46400 1727204593.41568: Calling groups_inventory to load vars for managed-node2 46400 1727204593.41570: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.41580: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.41583: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.41585: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.43008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.43965: done with get_vars() 46400 1727204593.43986: done getting variables 46400 1727204593.44035: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.080) 0:01:23.725 ***** 46400 1727204593.44066: entering _queue_task() for managed-node2/package 46400 1727204593.44319: worker is 1 (out of 1 available) 46400 1727204593.44333: exiting _queue_task() for managed-node2/package 46400 1727204593.44345: done queuing things up, now waiting for results queue to drain 46400 1727204593.44347: waiting for pending results... 46400 1727204593.44549: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204593.44666: in run() - task 0affcd87-79f5-1303-fda8-0000000019c7 46400 1727204593.44672: variable 'ansible_search_path' from source: unknown 46400 1727204593.44679: variable 'ansible_search_path' from source: unknown 46400 1727204593.44715: calling self._execute() 46400 1727204593.44800: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.44804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.44814: variable 'omit' from source: magic vars 46400 1727204593.45096: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.45105: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.45259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204593.45462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204593.45497: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204593.45523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204593.45610: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204593.45718: variable 'network_packages' from source: role '' defaults 46400 1727204593.45823: variable '__network_provider_setup' from source: role '' defaults 46400 1727204593.45839: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204593.45906: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204593.45922: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204593.45987: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204593.46169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.48445: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.48502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.48534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.48561: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.48582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.48645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.48666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.48685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.48713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.48724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.48758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.48777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.48794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.48821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.48831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.48982: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204593.49059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.49084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.49101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.49125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.49139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.49204: variable 'ansible_python' from source: facts 46400 1727204593.49218: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204593.49282: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204593.49341: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204593.49436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.49452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.49476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.49505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.49516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.49548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.49574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.49591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.49619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.49632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.49739: variable 'network_connections' from source: include params 46400 1727204593.49745: variable 'interface' from source: play vars 46400 1727204593.49821: variable 'interface' from source: play vars 46400 1727204593.49875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204593.49898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204593.49917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.49944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204593.49980: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.50172: variable 'network_connections' from source: include params 46400 1727204593.50175: variable 'interface' from source: play vars 46400 1727204593.50257: variable 'interface' from source: play vars 46400 1727204593.50298: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204593.50392: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.50696: variable 'network_connections' from source: include params 46400 1727204593.50706: variable 'interface' from source: play vars 46400 1727204593.50779: variable 'interface' from source: play vars 46400 1727204593.50805: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204593.50893: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204593.51200: variable 'network_connections' from source: include params 46400 1727204593.51209: variable 'interface' from source: play vars 46400 1727204593.51281: variable 'interface' from source: play vars 46400 1727204593.51342: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204593.51398: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204593.51402: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204593.51455: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204593.51647: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204593.52159: variable 'network_connections' from source: include params 46400 1727204593.52172: variable 'interface' from source: play vars 46400 1727204593.52235: variable 'interface' from source: play vars 46400 1727204593.52247: variable 'ansible_distribution' from source: facts 46400 1727204593.52255: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.52270: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.52287: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204593.52457: variable 'ansible_distribution' from source: 
facts 46400 1727204593.52468: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.52478: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.52494: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204593.52663: variable 'ansible_distribution' from source: facts 46400 1727204593.52677: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.52687: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.52730: variable 'network_provider' from source: set_fact 46400 1727204593.52752: variable 'ansible_facts' from source: unknown 46400 1727204593.53298: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204593.53305: when evaluation is False, skipping this task 46400 1727204593.53315: _execute() done 46400 1727204593.53318: dumping result to json 46400 1727204593.53321: done dumping result, returning 46400 1727204593.53329: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-0000000019c7] 46400 1727204593.53334: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c7 46400 1727204593.53432: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c7 46400 1727204593.53434: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204593.53501: no more pending results, returning what we have 46400 1727204593.53505: results queue empty 46400 1727204593.53506: checking for any_errors_fatal 46400 1727204593.53514: done checking for any_errors_fatal 46400 1727204593.53514: checking for max_fail_percentage 46400 1727204593.53516: done checking for max_fail_percentage 46400 1727204593.53517: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.53518: done checking to see if all hosts have failed 46400 1727204593.53519: getting the remaining hosts for this loop 46400 1727204593.53521: done getting the remaining hosts for this loop 46400 1727204593.53525: getting the next task for host managed-node2 46400 1727204593.53533: done getting next task for host managed-node2 46400 1727204593.53537: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204593.53542: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.53571: getting variables 46400 1727204593.53573: in VariableManager get_vars() 46400 1727204593.53618: Calling all_inventory to load vars for managed-node2 46400 1727204593.53620: Calling groups_inventory to load vars for managed-node2 46400 1727204593.53623: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.53633: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.53635: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.53638: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.54640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.55632: done with get_vars() 46400 1727204593.55657: done getting variables 46400 1727204593.55730: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.116) 0:01:23.842 ***** 46400 1727204593.55768: entering _queue_task() for managed-node2/package 46400 1727204593.56147: worker is 1 (out of 1 available) 46400 1727204593.56159: exiting _queue_task() for managed-node2/package 46400 1727204593.56174: done queuing things up, now waiting for results queue to drain 46400 1727204593.56176: waiting for pending results... 
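The "Install packages" task above is skipped because its guard evaluated False against the gathered package facts. A minimal sketch of that guard pattern, assuming a package task shaped like the role's (the module choice and the name list are illustrative; only the when expression is taken verbatim from the false_condition in the log):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test keeps the package module from running at all when every requested package already appears in ansible_facts.packages, which is why the result above is a skip rather than an ok/changed.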
46400 1727204593.56500: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204593.56684: in run() - task 0affcd87-79f5-1303-fda8-0000000019c8 46400 1727204593.56706: variable 'ansible_search_path' from source: unknown 46400 1727204593.56715: variable 'ansible_search_path' from source: unknown 46400 1727204593.56760: calling self._execute() 46400 1727204593.56885: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.56891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.56900: variable 'omit' from source: magic vars 46400 1727204593.57186: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.57195: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.57298: variable 'network_state' from source: role '' defaults 46400 1727204593.57307: Evaluated conditional (network_state != {}): False 46400 1727204593.57310: when evaluation is False, skipping this task 46400 1727204593.57316: _execute() done 46400 1727204593.57323: dumping result to json 46400 1727204593.57325: done dumping result, returning 46400 1727204593.57333: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000019c8] 46400 1727204593.57338: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c8 46400 1727204593.57437: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c8 46400 1727204593.57440: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204593.57492: no more pending results, returning what we have 46400 1727204593.57496: results queue empty 46400 1727204593.57497: checking for any_errors_fatal 46400 1727204593.57506: done checking for any_errors_fatal 46400 1727204593.57507: checking for max_fail_percentage 46400 1727204593.57508: done checking for max_fail_percentage 46400 1727204593.57509: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.57510: done checking to see if all hosts have failed 46400 1727204593.57511: getting the remaining hosts for this loop 46400 1727204593.57512: done getting the remaining hosts for this loop 46400 1727204593.57516: getting the next task for host managed-node2 46400 1727204593.57526: done getting next task for host managed-node2 46400 1727204593.57530: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204593.57536: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.57565: getting variables 46400 1727204593.57567: in VariableManager get_vars() 46400 1727204593.57604: Calling all_inventory to load vars for managed-node2 46400 1727204593.57607: Calling groups_inventory to load vars for managed-node2 46400 1727204593.57609: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.57619: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.57622: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.57625: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.58456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.59509: done with get_vars() 46400 1727204593.59527: done getting variables 46400 1727204593.59575: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.038) 0:01:23.880 ***** 46400 1727204593.59600: entering _queue_task() for managed-node2/package 46400 1727204593.59834: worker is 1 (out of 1 available) 46400 1727204593.59848: exiting _queue_task() for managed-node2/package 46400 1727204593.59860: done queuing things up, now waiting for results queue to drain 46400 1727204593.59862: waiting for pending results... 
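For the "Install NetworkManager and nmstate when using network_state variable" task above, the distribution check passes but network_state is empty, so the task is skipped. A minimal sketch of what such a task could look like, assuming the package names follow the task title (only the two when expressions are taken from the evaluated conditionals in the log; everything else is illustrative):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}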
46400 1727204593.60057: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204593.60170: in run() - task 0affcd87-79f5-1303-fda8-0000000019c9 46400 1727204593.60180: variable 'ansible_search_path' from source: unknown 46400 1727204593.60185: variable 'ansible_search_path' from source: unknown 46400 1727204593.60215: calling self._execute() 46400 1727204593.60291: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.60297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.60304: variable 'omit' from source: magic vars 46400 1727204593.60586: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.60595: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.60682: variable 'network_state' from source: role '' defaults 46400 1727204593.60692: Evaluated conditional (network_state != {}): False 46400 1727204593.60695: when evaluation is False, skipping this task 46400 1727204593.60698: _execute() done 46400 1727204593.60700: dumping result to json 46400 1727204593.60703: done dumping result, returning 46400 1727204593.60709: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000019c9] 46400 1727204593.60715: sending task result for task 0affcd87-79f5-1303-fda8-0000000019c9 46400 1727204593.60808: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019c9 46400 1727204593.60811: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204593.60856: no more pending results, returning what we have 46400 1727204593.60863: results queue empty 46400 1727204593.60865: checking for any_errors_fatal 46400 1727204593.60875: done checking for any_errors_fatal 46400 1727204593.60876: checking for max_fail_percentage 46400 1727204593.60877: done checking for max_fail_percentage 46400 1727204593.60878: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.60879: done checking to see if all hosts have failed 46400 1727204593.60879: getting the remaining hosts for this loop 46400 1727204593.60881: done getting the remaining hosts for this loop 46400 1727204593.60921: getting the next task for host managed-node2 46400 1727204593.61013: done getting next task for host managed-node2 46400 1727204593.61018: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204593.61024: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204593.61044: getting variables 46400 1727204593.61046: in VariableManager get_vars() 46400 1727204593.61085: Calling all_inventory to load vars for managed-node2 46400 1727204593.61088: Calling groups_inventory to load vars for managed-node2 46400 1727204593.61090: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.61099: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.61101: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.61104: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.61920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.63516: done with get_vars() 46400 1727204593.63543: done getting variables 46400 1727204593.63617: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.040) 0:01:23.921 ***** 46400 1727204593.63656: entering _queue_task() for managed-node2/service 46400 1727204593.64070: worker is 1 (out of 1 available) 46400 1727204593.64087: exiting _queue_task() for managed-node2/service 46400 1727204593.64117: done queuing things up, now waiting for results queue to drain 46400 1727204593.64122: waiting for pending results... 
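The task queued next, "Restart NetworkManager due to wireless or team interfaces", uses the service action plugin loaded above and, as the evaluation further down shows, is guarded by __network_wireless_connections_defined or __network_team_connections_defined. A minimal sketch of that shape, assuming NetworkManager is the service being restarted (the module and service name are illustrative; the when expression matches the false_condition reported below):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined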
46400 1727204593.64338: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204593.64432: in run() - task 0affcd87-79f5-1303-fda8-0000000019ca 46400 1727204593.64444: variable 'ansible_search_path' from source: unknown 46400 1727204593.64447: variable 'ansible_search_path' from source: unknown 46400 1727204593.64482: calling self._execute() 46400 1727204593.64572: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.64579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.64582: variable 'omit' from source: magic vars 46400 1727204593.64857: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.64871: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.64955: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.65097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.67367: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.67445: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.67500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.67541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.67579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.67675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.67733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.67766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.67822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.67843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.67892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.67931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.67962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204593.68009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.68041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.68089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.68117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.68157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.68202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.68219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.68418: variable 'network_connections' from source: include params 46400 1727204593.68434: variable 'interface' from source: play vars 46400 1727204593.68517: variable 'interface' from source: play vars 46400 1727204593.68604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204593.68753: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204593.68804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204593.68840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204593.68874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204593.68929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204593.68953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204593.68984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.69022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204593.69074: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204593.69346: variable 'network_connections' from source: include params 46400 1727204593.69357: variable 'interface' 
from source: play vars 46400 1727204593.69425: variable 'interface' from source: play vars 46400 1727204593.69466: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204593.69476: when evaluation is False, skipping this task 46400 1727204593.69483: _execute() done 46400 1727204593.69489: dumping result to json 46400 1727204593.69496: done dumping result, returning 46400 1727204593.69507: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000019ca] 46400 1727204593.69517: sending task result for task 0affcd87-79f5-1303-fda8-0000000019ca 46400 1727204593.69636: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019ca 46400 1727204593.69654: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204593.69710: no more pending results, returning what we have 46400 1727204593.69715: results queue empty 46400 1727204593.69716: checking for any_errors_fatal 46400 1727204593.69725: done checking for any_errors_fatal 46400 1727204593.69726: checking for max_fail_percentage 46400 1727204593.69728: done checking for max_fail_percentage 46400 1727204593.69730: checking to see if all hosts have failed and the running result is not ok 46400 1727204593.69730: done checking to see if all hosts have failed 46400 1727204593.69731: getting the remaining hosts for this loop 46400 1727204593.69733: done getting the remaining hosts for this loop 46400 1727204593.69739: getting the next task for host managed-node2 46400 1727204593.69748: done getting next task for host managed-node2 46400 1727204593.69754: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204593.69760: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204593.69794: getting variables 46400 1727204593.69797: in VariableManager get_vars() 46400 1727204593.69846: Calling all_inventory to load vars for managed-node2 46400 1727204593.69849: Calling groups_inventory to load vars for managed-node2 46400 1727204593.69852: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204593.69866: Calling all_plugins_play to load vars for managed-node2 46400 1727204593.69869: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204593.69872: Calling groups_plugins_play to load vars for managed-node2 46400 1727204593.71980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204593.78907: done with get_vars() 46400 1727204593.78938: done getting variables 46400 1727204593.78996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.153) 0:01:24.074 ***** 46400 1727204593.79027: entering _queue_task() for managed-node2/service 46400 1727204593.79399: worker is 1 (out of 1 available) 46400 1727204593.79414: exiting _queue_task() for managed-node2/service 46400 1727204593.79429: done queuing things up, now waiting for results queue to drain 46400 1727204593.79430: waiting for pending results... 46400 1727204593.80106: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204593.80282: in run() - task 0affcd87-79f5-1303-fda8-0000000019cb 46400 1727204593.80293: variable 'ansible_search_path' from source: unknown 46400 1727204593.80297: variable 'ansible_search_path' from source: unknown 46400 1727204593.80341: calling self._execute() 46400 1727204593.80452: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.80457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.80489: variable 'omit' from source: magic vars 46400 1727204593.80895: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.80910: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204593.81091: variable 'network_provider' from source: set_fact 46400 1727204593.81102: variable 'network_state' from source: role '' defaults 46400 1727204593.81120: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204593.81124: variable 'omit' from source: magic vars 46400 1727204593.81195: variable 'omit' from source: magic vars 46400 1727204593.81230: variable 'network_service_name' from source: role '' defaults 46400 1727204593.81307: variable 'network_service_name' from source: role '' defaults 46400 1727204593.81427: variable '__network_provider_setup' from source: role '' defaults 46400 1727204593.81434: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204593.81506: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204593.81515: variable '__network_packages_default_nm' from source: role '' defaults 
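For "Enable and start NetworkManager" the guard network_provider == "nm" or network_state != {} evaluates True above, so this task is not skipped and the systemd module is actually executed on the remote host below. A minimal sketch of the task shape, assuming the service name comes from the network_service_name role default seen in the log (the task body details are illustrative; the when expression is taken from the evaluated conditional above):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}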
46400 1727204593.81592: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204593.82151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204593.85050: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204593.85139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204593.85287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204593.85291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204593.85293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204593.85573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.86290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.86434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.86481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.86496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.86657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.86684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.86709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.86979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.86993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.87628: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204593.87876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.87901: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.87925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.88086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.88103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.88320: variable 'ansible_python' from source: facts 46400 1727204593.88337: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204593.88499: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204593.88586: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204593.88729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.88753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.88783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.88830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.88848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.88900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204593.88930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204593.88957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.88999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204593.89014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204593.89174: variable 'network_connections' from 
source: include params 46400 1727204593.89182: variable 'interface' from source: play vars 46400 1727204593.89269: variable 'interface' from source: play vars 46400 1727204593.89386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204593.89638: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204593.90321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204593.90362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204593.90455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204593.90518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204593.90661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204593.90694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204593.90723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204593.90894: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.91507: variable 'network_connections' from source: include params 46400 1727204593.91514: variable 'interface' from source: play vars 46400 1727204593.91594: variable 'interface' from source: play vars 46400 1727204593.91743: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204593.91940: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204593.92477: variable 'network_connections' from source: include params 46400 1727204593.92602: variable 'interface' from source: play vars 46400 1727204593.92671: variable 'interface' from source: play vars 46400 1727204593.92694: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204593.92887: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204593.93547: variable 'network_connections' from source: include params 46400 1727204593.93551: variable 'interface' from source: play vars 46400 1727204593.93646: variable 'interface' from source: play vars 46400 1727204593.93714: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204593.93779: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204593.93788: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204593.93854: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204593.94096: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204593.94674: variable 'network_connections' from source: include params 46400 1727204593.94679: variable 'interface' from source: play vars 46400 1727204593.94739: variable 'interface' from 
source: play vars 46400 1727204593.94746: variable 'ansible_distribution' from source: facts 46400 1727204593.94749: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.94757: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.94784: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204593.94961: variable 'ansible_distribution' from source: facts 46400 1727204593.94969: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.94976: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.94997: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204593.95180: variable 'ansible_distribution' from source: facts 46400 1727204593.95183: variable '__network_rh_distros' from source: role '' defaults 46400 1727204593.95186: variable 'ansible_distribution_major_version' from source: facts 46400 1727204593.95234: variable 'network_provider' from source: set_fact 46400 1727204593.95256: variable 'omit' from source: magic vars 46400 1727204593.95291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204593.95326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204593.95347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204593.95369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204593.95380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204593.95409: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204593.95415: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.95428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.95529: Set connection var ansible_shell_type to sh 46400 1727204593.95547: Set connection var ansible_shell_executable to /bin/sh 46400 1727204593.95553: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204593.95558: Set connection var ansible_connection to ssh 46400 1727204593.95567: Set connection var ansible_pipelining to False 46400 1727204593.95574: Set connection var ansible_timeout to 10 46400 1727204593.95601: variable 'ansible_shell_executable' from source: unknown 46400 1727204593.95605: variable 'ansible_connection' from source: unknown 46400 1727204593.95607: variable 'ansible_module_compression' from source: unknown 46400 1727204593.95610: variable 'ansible_shell_type' from source: unknown 46400 1727204593.95612: variable 'ansible_shell_executable' from source: unknown 46400 1727204593.95614: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204593.95618: variable 'ansible_pipelining' from source: unknown 46400 1727204593.95620: variable 'ansible_timeout' from source: unknown 46400 1727204593.95625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204593.95738: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204593.95755: variable 'omit' from source: magic vars 46400 1727204593.95760: starting attempt loop 46400 1727204593.95767: running the handler 46400 1727204593.95847: variable 'ansible_facts' from source: unknown 46400 1727204593.96717: _low_level_execute_command(): starting 46400 1727204593.96724: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204593.97521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204593.97534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204593.97543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204593.97559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204593.97604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204593.97620: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204593.97629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204593.97644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204593.97652: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204593.97659: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204593.97673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204593.97682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204593.97694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204593.97702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204593.97708: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204593.97721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204593.97799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204593.97819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204593.97834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204593.97919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204593.99582: stdout chunk (state=3): >>>/root <<< 46400 1727204593.99727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204593.99784: stderr chunk (state=3): >>><<< 46400 1727204593.99787: stdout chunk (state=3): >>><<< 46400 1727204593.99812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204593.99825: _low_level_execute_command(): starting 46400 1727204593.99830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539 `" && echo ansible-tmp-1727204593.9981196-52404-226363565843539="` echo /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539 `" ) && sleep 0' 46400 1727204594.00485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.00494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.00504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.00517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.00556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.00560: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.00578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.00591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.00597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.00603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.00611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.00620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.00631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.00637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.00643: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.00652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.00723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.00737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.00741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.00819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.02692: stdout chunk (state=3): >>>ansible-tmp-1727204593.9981196-52404-226363565843539=/root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539 <<< 46400 1727204594.02892: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 46400 1727204594.02923: stderr chunk (state=3): >>><<< 46400 1727204594.02927: stdout chunk (state=3): >>><<< 46400 1727204594.03122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204593.9981196-52404-226363565843539=/root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.03136: variable 'ansible_module_compression' from source: unknown 46400 1727204594.03139: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204594.03141: variable 'ansible_facts' from source: unknown 46400 1727204594.03357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/AnsiballZ_systemd.py 46400 1727204594.03535: Sending initial data 46400 1727204594.03538: Sent initial data (156 bytes) 46400 1727204594.04715: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.04733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.04761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.04785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.04828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.04842: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.04865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.04890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.04903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.04915: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.04929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.04944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.04969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.04988: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.05000: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.05014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.05104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.05128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.05144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.05221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.06956: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204594.07015: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204594.07025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp5uzfioog /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/AnsiballZ_systemd.py <<< 46400 1727204594.07051: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204594.09736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.09955: stderr chunk (state=3): >>><<< 46400 1727204594.09959: stdout chunk (state=3): >>><<< 46400 1727204594.09961: done transferring module to remote 46400 1727204594.09969: _low_level_execute_command(): starting 46400 1727204594.09973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/ /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/AnsiballZ_systemd.py && sleep 0' 46400 1727204594.10622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.10638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.10653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.10674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.10733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.10747: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.10762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.10783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.10796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.10807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.10827: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.10848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.10866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.10878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.10888: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.10899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.10985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.11007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.11021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.11098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.12904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.12909: stdout chunk (state=3): >>><<< 46400 1727204594.12911: stderr chunk (state=3): >>><<< 46400 1727204594.13016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.13025: _low_level_execute_command(): starting 46400 1727204594.13028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/AnsiballZ_systemd.py && sleep 0' 46400 1727204594.13660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.13690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.13705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.13722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.13766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.13785: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.13800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.13817: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.13828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.13837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.13848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.13861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.13878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.13895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.13905: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.13918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.13992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.14022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.14039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.14122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.39136: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204594.39166: stdout chunk (state=3): >>>er.service", 
"ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2196330000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204594.39175: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204594.40634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204594.40699: stderr chunk (state=3): >>><<< 46400 1727204594.40703: stdout chunk (state=3): >>><<< 46400 1727204594.40726: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2196330000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204594.40958: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204594.40967: _low_level_execute_command(): starting 46400 1727204594.40970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204593.9981196-52404-226363565843539/ > /dev/null 2>&1 && sleep 0' 46400 1727204594.41675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.41690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.41705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.41723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.41775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.41787: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.41801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.41819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.41832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.41849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.41870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.41885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.41900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.41912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.41924: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.41937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.42020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.42036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.42052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.42127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.43918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.43974: stderr chunk (state=3): >>><<< 46400 1727204594.43978: stdout chunk (state=3): >>><<< 46400 1727204594.43995: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.44003: handler run complete 46400 1727204594.44046: attempt loop complete, returning result 46400 1727204594.44050: _execute() done 46400 1727204594.44052: dumping result to json 46400 1727204594.44067: done dumping result, returning 46400 1727204594.44075: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-0000000019cb] 46400 1727204594.44081: sending task result for task 0affcd87-79f5-1303-fda8-0000000019cb 46400 1727204594.44321: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019cb 46400 1727204594.44324: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204594.44391: no more pending results, returning what we have 46400 1727204594.44395: results queue empty 46400 1727204594.44396: checking for any_errors_fatal 46400 1727204594.44402: done checking for any_errors_fatal 46400 1727204594.44402: checking for max_fail_percentage 46400 1727204594.44404: done checking for max_fail_percentage 46400 1727204594.44405: checking to see if all hosts have failed and the running result is not ok 46400 1727204594.44406: done checking to see if all hosts have failed 46400 1727204594.44406: getting the remaining hosts for this loop 46400 1727204594.44408: done getting the remaining hosts for this loop 46400 1727204594.44412: getting the next task for host managed-node2 46400 1727204594.44419: done getting next task for host managed-node2 46400 1727204594.44423: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204594.44428: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204594.44442: getting variables 46400 1727204594.44444: in VariableManager get_vars() 46400 1727204594.44483: Calling all_inventory to load vars for managed-node2 46400 1727204594.44486: Calling groups_inventory to load vars for managed-node2 46400 1727204594.44488: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204594.44499: Calling all_plugins_play to load vars for managed-node2 46400 1727204594.44501: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204594.44504: Calling groups_plugins_play to load vars for managed-node2 46400 1727204594.45901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204594.47758: done with get_vars() 46400 1727204594.47800: done getting variables 46400 1727204594.47883: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.688) 0:01:24.763 ***** 46400 1727204594.47923: entering _queue_task() for managed-node2/service 46400 1727204594.48329: worker is 1 (out of 1 available) 46400 1727204594.48344: exiting _queue_task() for managed-node2/service 46400 1727204594.48357: done queuing things up, now waiting for results queue to drain 46400 1727204594.48362: waiting for pending results... 
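Editor's note: the "censored" value reported for the NetworkManager result above is the normal effect of no_log: the module ran and returned an ok result, but callbacks replace the payload with a placeholder. A minimal, purely hypothetical illustration:

    - name: Illustrative no_log task (hypothetical example)
      ansible.builtin.command: /bin/true
      register: example_result
      no_log: true
      # example_result is still usable by later tasks, but its contents are
      # replaced with a censored placeholder in logs and callback output
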
46400 1727204594.48983: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204594.48989: in run() - task 0affcd87-79f5-1303-fda8-0000000019cc 46400 1727204594.48992: variable 'ansible_search_path' from source: unknown 46400 1727204594.48995: variable 'ansible_search_path' from source: unknown 46400 1727204594.48998: calling self._execute() 46400 1727204594.49076: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.49080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.49090: variable 'omit' from source: magic vars 46400 1727204594.49519: variable 'ansible_distribution_major_version' from source: facts 46400 1727204594.49530: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204594.49653: variable 'network_provider' from source: set_fact 46400 1727204594.49658: Evaluated conditional (network_provider == "nm"): True 46400 1727204594.49767: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204594.49860: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204594.50071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204594.53170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204594.53244: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204594.53282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204594.53315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204594.53335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204594.53404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204594.53426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204594.53443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204594.53476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204594.53487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204594.53523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204594.53540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204594.53556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204594.53594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204594.53605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204594.53638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204594.53654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204594.53675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204594.53701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204594.53711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204594.53817: variable 'network_connections' from source: include params 46400 1727204594.53830: variable 'interface' from source: play vars 46400 1727204594.53885: variable 'interface' from source: play vars 46400 1727204594.53939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204594.54071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204594.54098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204594.54122: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204594.54144: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204594.54183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204594.54199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204594.54216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204594.54236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
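Editor's note: the role defaults read above (__network_wpa_supplicant_required, __network_ieee802_1x_connections_defined, and __network_wireless_connections_defined just below) together decide whether a supplicant is needed at all. As a rough sketch of how such a derived default can be written in a role's defaults (the exact expression used by the role may differ):

    __network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"
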
46400 1727204594.54279: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204594.54446: variable 'network_connections' from source: include params 46400 1727204594.54450: variable 'interface' from source: play vars 46400 1727204594.54500: variable 'interface' from source: play vars 46400 1727204594.54523: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204594.54526: when evaluation is False, skipping this task 46400 1727204594.54529: _execute() done 46400 1727204594.54531: dumping result to json 46400 1727204594.54534: done dumping result, returning 46400 1727204594.54540: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-0000000019cc] 46400 1727204594.54555: sending task result for task 0affcd87-79f5-1303-fda8-0000000019cc skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204594.54698: no more pending results, returning what we have 46400 1727204594.54704: results queue empty 46400 1727204594.54705: checking for any_errors_fatal 46400 1727204594.54723: done checking for any_errors_fatal 46400 1727204594.54723: checking for max_fail_percentage 46400 1727204594.54725: done checking for max_fail_percentage 46400 1727204594.54726: checking to see if all hosts have failed and the running result is not ok 46400 1727204594.54727: done checking to see if all hosts have failed 46400 1727204594.54728: getting the remaining hosts for this loop 46400 1727204594.54729: done getting the remaining hosts for this loop 46400 1727204594.54733: getting the next task for host managed-node2 46400 1727204594.54741: done getting next task for host managed-node2 46400 1727204594.54745: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204594.54750: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204594.54784: getting variables 46400 1727204594.54786: in VariableManager get_vars() 46400 1727204594.54828: Calling all_inventory to load vars for managed-node2 46400 1727204594.54831: Calling groups_inventory to load vars for managed-node2 46400 1727204594.54833: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204594.54844: Calling all_plugins_play to load vars for managed-node2 46400 1727204594.54846: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204594.54849: Calling groups_plugins_play to load vars for managed-node2 46400 1727204594.55562: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019cc 46400 1727204594.55568: WORKER PROCESS EXITING 46400 1727204594.56185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204594.57452: done with get_vars() 46400 1727204594.57476: done getting variables 46400 1727204594.57522: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.096) 0:01:24.860 ***** 46400 1727204594.57549: entering _queue_task() for managed-node2/service 46400 1727204594.57807: worker is 1 (out of 1 available) 46400 1727204594.57823: exiting _queue_task() for managed-node2/service 46400 1727204594.57837: done queuing things up, now waiting for results queue to drain 46400 1727204594.57838: waiting for pending results... 
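Editor's note: the wpa_supplicant task skipped just above (false_condition: __network_wpa_supplicant_required) is driven through the same service action plugin as the NetworkManager task, only gated by additional conditionals. A rough sketch of such a task, with the service name assumed from the task title:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant          # assumed from the task title
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool
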
46400 1727204594.58037: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204594.58144: in run() - task 0affcd87-79f5-1303-fda8-0000000019cd 46400 1727204594.58155: variable 'ansible_search_path' from source: unknown 46400 1727204594.58159: variable 'ansible_search_path' from source: unknown 46400 1727204594.58194: calling self._execute() 46400 1727204594.58279: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.58284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.58293: variable 'omit' from source: magic vars 46400 1727204594.58597: variable 'ansible_distribution_major_version' from source: facts 46400 1727204594.58607: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204594.58702: variable 'network_provider' from source: set_fact 46400 1727204594.58706: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204594.58710: when evaluation is False, skipping this task 46400 1727204594.58712: _execute() done 46400 1727204594.58715: dumping result to json 46400 1727204594.58718: done dumping result, returning 46400 1727204594.58725: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-0000000019cd] 46400 1727204594.58732: sending task result for task 0affcd87-79f5-1303-fda8-0000000019cd 46400 1727204594.58825: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019cd 46400 1727204594.58830: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204594.58883: no more pending results, returning what we have 46400 1727204594.58887: results queue empty 46400 1727204594.58888: checking for any_errors_fatal 46400 1727204594.58899: done checking for any_errors_fatal 46400 1727204594.58900: checking for max_fail_percentage 46400 1727204594.58902: done checking for max_fail_percentage 46400 1727204594.58903: checking to see if all hosts have failed and the running result is not ok 46400 1727204594.58904: done checking to see if all hosts have failed 46400 1727204594.58904: getting the remaining hosts for this loop 46400 1727204594.58906: done getting the remaining hosts for this loop 46400 1727204594.58910: getting the next task for host managed-node2 46400 1727204594.58919: done getting next task for host managed-node2 46400 1727204594.58923: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204594.58929: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204594.58958: getting variables 46400 1727204594.58961: in VariableManager get_vars() 46400 1727204594.59005: Calling all_inventory to load vars for managed-node2 46400 1727204594.59008: Calling groups_inventory to load vars for managed-node2 46400 1727204594.59010: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204594.59020: Calling all_plugins_play to load vars for managed-node2 46400 1727204594.59023: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204594.59025: Calling groups_plugins_play to load vars for managed-node2 46400 1727204594.60216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204594.62072: done with get_vars() 46400 1727204594.62096: done getting variables 46400 1727204594.62162: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.046) 0:01:24.906 ***** 46400 1727204594.62203: entering _queue_task() for managed-node2/copy 46400 1727204594.62571: worker is 1 (out of 1 available) 46400 1727204594.62584: exiting _queue_task() for managed-node2/copy 46400 1727204594.62598: done queuing things up, now waiting for results queue to drain 46400 1727204594.62600: waiting for pending results... 
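Editor's note: both the "Enable network service" task skipped above (network_provider == "initscripts" evaluated False) and the initscripts file-dependency copy task queued here apply only to the legacy provider. A rough sketch of that gating, with the service name and file path assumed:

    - name: Enable network service
      ansible.builtin.service:
        name: network                   # assumed legacy initscripts service name
        state: started
        enabled: true
      when: network_provider == "initscripts"

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network    # assumed path for the legacy dependency file
        content: "# Created by the network system role\n"
        force: false
      when: network_provider == "initscripts"
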
46400 1727204594.62910: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204594.63078: in run() - task 0affcd87-79f5-1303-fda8-0000000019ce 46400 1727204594.63097: variable 'ansible_search_path' from source: unknown 46400 1727204594.63104: variable 'ansible_search_path' from source: unknown 46400 1727204594.63147: calling self._execute() 46400 1727204594.63250: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.63267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.63283: variable 'omit' from source: magic vars 46400 1727204594.63691: variable 'ansible_distribution_major_version' from source: facts 46400 1727204594.63712: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204594.63841: variable 'network_provider' from source: set_fact 46400 1727204594.63852: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204594.63859: when evaluation is False, skipping this task 46400 1727204594.63867: _execute() done 46400 1727204594.63875: dumping result to json 46400 1727204594.63881: done dumping result, returning 46400 1727204594.63891: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-0000000019ce] 46400 1727204594.63903: sending task result for task 0affcd87-79f5-1303-fda8-0000000019ce skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204594.64069: no more pending results, returning what we have 46400 1727204594.64074: results queue empty 46400 1727204594.64075: checking for any_errors_fatal 46400 1727204594.64081: done checking for any_errors_fatal 46400 1727204594.64082: checking for max_fail_percentage 46400 1727204594.64084: done checking for max_fail_percentage 46400 1727204594.64085: checking to see if all hosts have failed and the running result is not ok 46400 1727204594.64085: done checking to see if all hosts have failed 46400 1727204594.64086: getting the remaining hosts for this loop 46400 1727204594.64088: done getting the remaining hosts for this loop 46400 1727204594.64092: getting the next task for host managed-node2 46400 1727204594.64102: done getting next task for host managed-node2 46400 1727204594.64107: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204594.64114: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204594.64146: getting variables 46400 1727204594.64148: in VariableManager get_vars() 46400 1727204594.64196: Calling all_inventory to load vars for managed-node2 46400 1727204594.64199: Calling groups_inventory to load vars for managed-node2 46400 1727204594.64202: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204594.64216: Calling all_plugins_play to load vars for managed-node2 46400 1727204594.64219: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204594.64222: Calling groups_plugins_play to load vars for managed-node2 46400 1727204594.65185: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019ce 46400 1727204594.65189: WORKER PROCESS EXITING 46400 1727204594.65994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204594.67713: done with get_vars() 46400 1727204594.67747: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.056) 0:01:24.963 ***** 46400 1727204594.67845: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204594.68200: worker is 1 (out of 1 available) 46400 1727204594.68212: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204594.68226: done queuing things up, now waiting for results queue to drain 46400 1727204594.68228: waiting for pending results... 
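Editor's note: the "Configure networking connection profiles" task queued here consumes the network_connections variable (shown earlier as coming from include params) together with the interface play variable. The actual values for this run are not printed in the log, so purely as an illustration of the structure this role expects, a single-interface profile might look like:

    network_connections:
      - name: "{{ interface }}"
        interface_name: "{{ interface }}"
        type: ethernet
        state: up
        # illustrative only; real profiles add ip, autoconnect, etc. as needed
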
46400 1727204594.68528: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204594.68711: in run() - task 0affcd87-79f5-1303-fda8-0000000019cf 46400 1727204594.68733: variable 'ansible_search_path' from source: unknown 46400 1727204594.68742: variable 'ansible_search_path' from source: unknown 46400 1727204594.68790: calling self._execute() 46400 1727204594.68900: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.68913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.68928: variable 'omit' from source: magic vars 46400 1727204594.69321: variable 'ansible_distribution_major_version' from source: facts 46400 1727204594.69341: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204594.69353: variable 'omit' from source: magic vars 46400 1727204594.69420: variable 'omit' from source: magic vars 46400 1727204594.69601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204594.72011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204594.72093: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204594.72137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204594.72184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204594.72218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204594.72316: variable 'network_provider' from source: set_fact 46400 1727204594.72453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204594.72494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204594.72526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204594.72578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204594.72604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204594.72686: variable 'omit' from source: magic vars 46400 1727204594.72811: variable 'omit' from source: magic vars 46400 1727204594.72925: variable 'network_connections' from source: include params 46400 1727204594.72943: variable 'interface' from source: play vars 46400 1727204594.73008: variable 'interface' from source: play vars 46400 1727204594.73169: variable 'omit' from source: magic vars 46400 1727204594.73184: variable '__lsr_ansible_managed' from source: task vars 46400 1727204594.73250: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204594.73449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204594.73958: Loaded config def from plugin (lookup/template) 46400 1727204594.73971: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204594.74004: File lookup term: get_ansible_managed.j2 46400 1727204594.74011: variable 'ansible_search_path' from source: unknown 46400 1727204594.74022: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204594.74046: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204594.74071: variable 'ansible_search_path' from source: unknown 46400 1727204594.81291: variable 'ansible_managed' from source: unknown 46400 1727204594.81703: variable 'omit' from source: magic vars 46400 1727204594.81772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204594.81810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204594.81975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204594.82028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204594.82044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204594.82081: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204594.82108: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.82116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.82245: Set connection var ansible_shell_type to sh 46400 1727204594.82259: Set connection var ansible_shell_executable to /bin/sh 46400 1727204594.82270: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204594.82279: Set connection var ansible_connection to ssh 46400 1727204594.82303: Set connection var ansible_pipelining to False 46400 1727204594.82319: Set connection var ansible_timeout to 10 46400 1727204594.82351: variable 'ansible_shell_executable' from source: unknown 46400 1727204594.82360: variable 'ansible_connection' from source: unknown 46400 1727204594.82371: variable 'ansible_module_compression' 
from source: unknown 46400 1727204594.82379: variable 'ansible_shell_type' from source: unknown 46400 1727204594.82386: variable 'ansible_shell_executable' from source: unknown 46400 1727204594.82394: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204594.82407: variable 'ansible_pipelining' from source: unknown 46400 1727204594.82414: variable 'ansible_timeout' from source: unknown 46400 1727204594.82425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204594.82570: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204594.82595: variable 'omit' from source: magic vars 46400 1727204594.82604: starting attempt loop 46400 1727204594.82610: running the handler 46400 1727204594.82632: _low_level_execute_command(): starting 46400 1727204594.82649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204594.83382: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.83405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.83420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.83438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.83485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.83501: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.83519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.83537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.83547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.83557: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.83571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.83585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.83602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.83618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.83633: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.83648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.83727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.83752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.83769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.83850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.85497: stdout chunk (state=3): >>>/root <<< 46400 1727204594.85699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.85703: stdout chunk (state=3): >>><<< 46400 1727204594.85705: stderr 
chunk (state=3): >>><<< 46400 1727204594.85823: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.85826: _low_level_execute_command(): starting 46400 1727204594.85829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837 `" && echo ansible-tmp-1727204594.857257-52440-209576967751837="` echo /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837 `" ) && sleep 0' 46400 1727204594.86494: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.86509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.86524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.86544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.86600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.86614: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.86631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.86650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.86662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.86678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.86702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.86722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.86741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.86754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.86767: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.86780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.86870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 46400 1727204594.86893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.86910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.86995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.88869: stdout chunk (state=3): >>>ansible-tmp-1727204594.857257-52440-209576967751837=/root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837 <<< 46400 1727204594.89079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.89083: stdout chunk (state=3): >>><<< 46400 1727204594.89086: stderr chunk (state=3): >>><<< 46400 1727204594.89276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204594.857257-52440-209576967751837=/root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.89286: variable 'ansible_module_compression' from source: unknown 46400 1727204594.89289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204594.89291: variable 'ansible_facts' from source: unknown 46400 1727204594.89357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/AnsiballZ_network_connections.py 46400 1727204594.89528: Sending initial data 46400 1727204594.89535: Sent initial data (167 bytes) 46400 1727204594.90943: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.90966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.90982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.90999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.91052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.91067: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.91085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.91102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
46400 1727204594.91113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.91127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.91138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.91153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.91178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.91191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.91201: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.91213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.91310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.91332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.91351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.91475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.93302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204594.93341: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204594.93383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpo5yj8dxp /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/AnsiballZ_network_connections.py <<< 46400 1727204594.93419: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204594.95511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.95571: stderr chunk (state=3): >>><<< 46400 1727204594.95574: stdout chunk (state=3): >>><<< 46400 1727204594.95577: done transferring module to remote 46400 1727204594.95675: _low_level_execute_command(): starting 46400 1727204594.95679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/ /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/AnsiballZ_network_connections.py && sleep 0' 46400 1727204594.96925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204594.96940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.96954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.96979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.97022: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.97091: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204594.97105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.97122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204594.97133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204594.97143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204594.97154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204594.97169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204594.97189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204594.97201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204594.97212: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204594.97310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204594.97387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204594.97407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204594.97422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204594.97566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204594.99392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204594.99397: stdout chunk (state=3): >>><<< 46400 1727204594.99399: stderr chunk (state=3): >>><<< 46400 1727204594.99470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204594.99479: _low_level_execute_command(): starting 46400 1727204594.99481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/AnsiballZ_network_connections.py && sleep 0' 46400 1727204595.00187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.00214: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204595.00230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.00249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.00295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.00318: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.00336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.00354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204595.00370: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.00382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.00394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.00410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.00436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.00450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.00460: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204595.00674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.00907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.00910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.00922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.01008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.23130: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204595.24672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204595.24677: stdout chunk (state=3): >>><<< 46400 1727204595.24698: stderr chunk (state=3): >>><<< 46400 1727204595.24848: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
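The result JSON above records the exact module_args the role passed to fedora.linux_system_roles.network_connections: provider nm and a single profile 'statebr' to bring up, which is a no-op here because the connection is already active. Below is a minimal sketch of a play that would drive an equivalent invocation; only the connection name and state are taken from the logged arguments, the play and variable layout are illustrative assumptions.

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # The role turns this list into the 'connections' module argument
        # seen in the result JSON above.
        network_connections:
          - name: statebr
            state: up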
46400 1727204595.24852: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204595.24855: _low_level_execute_command(): starting 46400 1727204595.24857: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204594.857257-52440-209576967751837/ > /dev/null 2>&1 && sleep 0' 46400 1727204595.25841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.25849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.25859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.25879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.25922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.25929: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.25941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.25955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204595.25968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.25976: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.25984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.25994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.26009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.26013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.26020: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204595.26026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.26107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.26127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.26134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.26200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.28020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.28120: stderr chunk (state=3): >>><<< 46400 
1727204595.28132: stdout chunk (state=3): >>><<< 46400 1727204595.28278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204595.28282: handler run complete 46400 1727204595.28284: attempt loop complete, returning result 46400 1727204595.28286: _execute() done 46400 1727204595.28288: dumping result to json 46400 1727204595.28290: done dumping result, returning 46400 1727204595.28293: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-0000000019cf] 46400 1727204595.28295: sending task result for task 0affcd87-79f5-1303-fda8-0000000019cf 46400 1727204595.28382: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019cf 46400 1727204595.28386: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active 46400 1727204595.28507: no more pending results, returning what we have 46400 1727204595.28511: results queue empty 46400 1727204595.28513: checking for any_errors_fatal 46400 1727204595.28520: done checking for any_errors_fatal 46400 1727204595.28521: checking for max_fail_percentage 46400 1727204595.28523: done checking for max_fail_percentage 46400 1727204595.28524: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.28525: done checking to see if all hosts have failed 46400 1727204595.28525: getting the remaining hosts for this loop 46400 1727204595.28527: done getting the remaining hosts for this loop 46400 1727204595.28531: getting the next task for host managed-node2 46400 1727204595.28542: done getting next task for host managed-node2 46400 1727204595.28546: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204595.28552: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204595.28675: getting variables 46400 1727204595.28678: in VariableManager get_vars() 46400 1727204595.28722: Calling all_inventory to load vars for managed-node2 46400 1727204595.28725: Calling groups_inventory to load vars for managed-node2 46400 1727204595.28728: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.28739: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.28742: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.28746: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.30820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.32611: done with get_vars() 46400 1727204595.32644: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.649) 0:01:25.612 ***** 46400 1727204595.32750: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204595.33143: worker is 1 (out of 1 available) 46400 1727204595.33157: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204595.33171: done queuing things up, now waiting for results queue to drain 46400 1727204595.33173: waiting for pending results... 
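The next entries show the "Configure networking state" task (main.yml:171) being skipped: network_state still holds the role default of {}, so the guard network_state != {} evaluates to False. A hedged reconstruction of that guard pattern follows; the action name fedora.linux_system_roles.network_state matches the log, but the parameter layout is an assumption.

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # Parameter shape is assumed; the task never runs in this play because
    # the condition below is False.
    state: "{{ network_state }}"
  when: network_state != {}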
46400 1727204595.33497: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204595.33686: in run() - task 0affcd87-79f5-1303-fda8-0000000019d0 46400 1727204595.33706: variable 'ansible_search_path' from source: unknown 46400 1727204595.33713: variable 'ansible_search_path' from source: unknown 46400 1727204595.33758: calling self._execute() 46400 1727204595.33884: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.33902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.33917: variable 'omit' from source: magic vars 46400 1727204595.34349: variable 'ansible_distribution_major_version' from source: facts 46400 1727204595.34370: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204595.34511: variable 'network_state' from source: role '' defaults 46400 1727204595.34530: Evaluated conditional (network_state != {}): False 46400 1727204595.34545: when evaluation is False, skipping this task 46400 1727204595.34553: _execute() done 46400 1727204595.34561: dumping result to json 46400 1727204595.34570: done dumping result, returning 46400 1727204595.34581: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-0000000019d0] 46400 1727204595.34594: sending task result for task 0affcd87-79f5-1303-fda8-0000000019d0 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204595.34770: no more pending results, returning what we have 46400 1727204595.34775: results queue empty 46400 1727204595.34776: checking for any_errors_fatal 46400 1727204595.34792: done checking for any_errors_fatal 46400 1727204595.34793: checking for max_fail_percentage 46400 1727204595.34795: done checking for max_fail_percentage 46400 1727204595.34797: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.34797: done checking to see if all hosts have failed 46400 1727204595.34798: getting the remaining hosts for this loop 46400 1727204595.34800: done getting the remaining hosts for this loop 46400 1727204595.34804: getting the next task for host managed-node2 46400 1727204595.34815: done getting next task for host managed-node2 46400 1727204595.34821: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204595.34828: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204595.34859: getting variables 46400 1727204595.34861: in VariableManager get_vars() 46400 1727204595.34914: Calling all_inventory to load vars for managed-node2 46400 1727204595.34918: Calling groups_inventory to load vars for managed-node2 46400 1727204595.34921: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.34935: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.34938: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.34941: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.35918: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019d0 46400 1727204595.35921: WORKER PROCESS EXITING 46400 1727204595.36833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.38682: done with get_vars() 46400 1727204595.38715: done getting variables 46400 1727204595.38788: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.060) 0:01:25.672 ***** 46400 1727204595.38824: entering _queue_task() for managed-node2/debug 46400 1727204595.39207: worker is 1 (out of 1 available) 46400 1727204595.39221: exiting _queue_task() for managed-node2/debug 46400 1727204595.39238: done queuing things up, now waiting for results queue to drain 46400 1727204595.39239: waiting for pending results... 
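Each task in this section logs the same "Set connection var ..." lines before contacting the host (see the next entries). The values in the sketch below are copied from those entries; presenting them as host variables is only one possible source, since most of them are reported "from source: unknown", i.e. built-in defaults rather than inventory settings.

# Equivalent per-host settings, e.g. in host_vars/managed-node2.yml (placement assumed)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_module_compression: ZIP_DEFLATED
ansible_pipelining: false
ansible_timeout: 10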
46400 1727204595.39589: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204595.39793: in run() - task 0affcd87-79f5-1303-fda8-0000000019d1 46400 1727204595.39813: variable 'ansible_search_path' from source: unknown 46400 1727204595.39821: variable 'ansible_search_path' from source: unknown 46400 1727204595.39872: calling self._execute() 46400 1727204595.39994: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.40011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.40025: variable 'omit' from source: magic vars 46400 1727204595.40456: variable 'ansible_distribution_major_version' from source: facts 46400 1727204595.40478: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204595.40498: variable 'omit' from source: magic vars 46400 1727204595.40575: variable 'omit' from source: magic vars 46400 1727204595.40624: variable 'omit' from source: magic vars 46400 1727204595.40678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204595.40728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204595.40758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204595.40787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.40803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.40845: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204595.40854: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.40863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.40981: Set connection var ansible_shell_type to sh 46400 1727204595.41001: Set connection var ansible_shell_executable to /bin/sh 46400 1727204595.41012: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204595.41021: Set connection var ansible_connection to ssh 46400 1727204595.41040: Set connection var ansible_pipelining to False 46400 1727204595.41051: Set connection var ansible_timeout to 10 46400 1727204595.41082: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.41090: variable 'ansible_connection' from source: unknown 46400 1727204595.41101: variable 'ansible_module_compression' from source: unknown 46400 1727204595.41109: variable 'ansible_shell_type' from source: unknown 46400 1727204595.41116: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.41123: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.41130: variable 'ansible_pipelining' from source: unknown 46400 1727204595.41144: variable 'ansible_timeout' from source: unknown 46400 1727204595.41156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.41321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204595.41338: variable 'omit' from source: magic vars 46400 1727204595.41348: starting attempt loop 46400 1727204595.41359: running the handler 46400 1727204595.41506: variable '__network_connections_result' from source: set_fact 46400 1727204595.41577: handler run complete 46400 1727204595.41604: attempt loop complete, returning result 46400 1727204595.41612: _execute() done 46400 1727204595.41620: dumping result to json 46400 1727204595.41627: done dumping result, returning 46400 1727204595.41644: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-0000000019d1] 46400 1727204595.41656: sending task result for task 0affcd87-79f5-1303-fda8-0000000019d1 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active" ] } 46400 1727204595.41853: no more pending results, returning what we have 46400 1727204595.41858: results queue empty 46400 1727204595.41859: checking for any_errors_fatal 46400 1727204595.41867: done checking for any_errors_fatal 46400 1727204595.41868: checking for max_fail_percentage 46400 1727204595.41870: done checking for max_fail_percentage 46400 1727204595.41871: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.41872: done checking to see if all hosts have failed 46400 1727204595.41873: getting the remaining hosts for this loop 46400 1727204595.41875: done getting the remaining hosts for this loop 46400 1727204595.41879: getting the next task for host managed-node2 46400 1727204595.41889: done getting next task for host managed-node2 46400 1727204595.41893: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204595.41899: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204595.41915: getting variables 46400 1727204595.41917: in VariableManager get_vars() 46400 1727204595.41963: Calling all_inventory to load vars for managed-node2 46400 1727204595.41968: Calling groups_inventory to load vars for managed-node2 46400 1727204595.41971: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.41982: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.41985: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.41989: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.43043: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019d1 46400 1727204595.43046: WORKER PROCESS EXITING 46400 1727204595.44028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.45798: done with get_vars() 46400 1727204595.45830: done getting variables 46400 1727204595.45906: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.071) 0:01:25.744 ***** 46400 1727204595.45950: entering _queue_task() for managed-node2/debug 46400 1727204595.46337: worker is 1 (out of 1 available) 46400 1727204595.46349: exiting _queue_task() for managed-node2/debug 46400 1727204595.46363: done queuing things up, now waiting for results queue to drain 46400 1727204595.46366: waiting for pending results... 
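The debug task whose output was just printed (main.yml:177) and the one queued next (main.yml:181) only echo parts of the registered __network_connections_result fact. A hedged reconstruction of both tasks follows; the log confirms the debug action and the variable names, the exact YAML is an assumption.

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result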
46400 1727204595.46699: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204595.46892: in run() - task 0affcd87-79f5-1303-fda8-0000000019d2 46400 1727204595.46913: variable 'ansible_search_path' from source: unknown 46400 1727204595.46927: variable 'ansible_search_path' from source: unknown 46400 1727204595.46978: calling self._execute() 46400 1727204595.47097: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.47109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.47123: variable 'omit' from source: magic vars 46400 1727204595.47550: variable 'ansible_distribution_major_version' from source: facts 46400 1727204595.47573: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204595.47589: variable 'omit' from source: magic vars 46400 1727204595.47669: variable 'omit' from source: magic vars 46400 1727204595.47722: variable 'omit' from source: magic vars 46400 1727204595.47774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204595.47826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204595.47858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204595.47885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.47904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.47949: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204595.47958: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.47968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.48090: Set connection var ansible_shell_type to sh 46400 1727204595.48106: Set connection var ansible_shell_executable to /bin/sh 46400 1727204595.48116: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204595.48131: Set connection var ansible_connection to ssh 46400 1727204595.48141: Set connection var ansible_pipelining to False 46400 1727204595.48158: Set connection var ansible_timeout to 10 46400 1727204595.48193: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.48201: variable 'ansible_connection' from source: unknown 46400 1727204595.48209: variable 'ansible_module_compression' from source: unknown 46400 1727204595.48216: variable 'ansible_shell_type' from source: unknown 46400 1727204595.48223: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.48232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.48244: variable 'ansible_pipelining' from source: unknown 46400 1727204595.48253: variable 'ansible_timeout' from source: unknown 46400 1727204595.48267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.48430: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204595.48449: variable 'omit' from source: magic vars 46400 1727204595.48463: starting attempt loop 46400 1727204595.48473: running the handler 46400 1727204595.48531: variable '__network_connections_result' from source: set_fact 46400 1727204595.48632: variable '__network_connections_result' from source: set_fact 46400 1727204595.48760: handler run complete 46400 1727204595.48798: attempt loop complete, returning result 46400 1727204595.48810: _execute() done 46400 1727204595.48822: dumping result to json 46400 1727204595.48831: done dumping result, returning 46400 1727204595.48844: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-0000000019d2] 46400 1727204595.48855: sending task result for task 0affcd87-79f5-1303-fda8-0000000019d2 46400 1727204595.48986: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019d2 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b skipped because already active" ] } } 46400 1727204595.49092: no more pending results, returning what we have 46400 1727204595.49096: results queue empty 46400 1727204595.49097: checking for any_errors_fatal 46400 1727204595.49106: done checking for any_errors_fatal 46400 1727204595.49107: checking for max_fail_percentage 46400 1727204595.49109: done checking for max_fail_percentage 46400 1727204595.49111: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.49112: done checking to see if all hosts have failed 46400 1727204595.49112: getting the remaining hosts for this loop 46400 1727204595.49114: done getting the remaining hosts for this loop 46400 1727204595.49119: getting the next task for host managed-node2 46400 1727204595.49129: done getting next task for host managed-node2 46400 1727204595.49133: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204595.49139: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204595.49155: getting variables 46400 1727204595.49157: in VariableManager get_vars() 46400 1727204595.49203: Calling all_inventory to load vars for managed-node2 46400 1727204595.49207: Calling groups_inventory to load vars for managed-node2 46400 1727204595.49210: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.49229: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.49232: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.49235: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.50246: WORKER PROCESS EXITING 46400 1727204595.51143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.52910: done with get_vars() 46400 1727204595.52946: done getting variables 46400 1727204595.53013: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.070) 0:01:25.815 ***** 46400 1727204595.53056: entering _queue_task() for managed-node2/debug 46400 1727204595.53424: worker is 1 (out of 1 available) 46400 1727204595.53438: exiting _queue_task() for managed-node2/debug 46400 1727204595.53451: done queuing things up, now waiting for results queue to drain 46400 1727204595.53453: waiting for pending results... 
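The remaining entries cover the last two tasks of this block: "Show debug messages for the network_state" (main.yml:186), skipped for the same network_state != {} reason as above, and "Re-test connectivity" (main.yml:192), which runs the ping action. A hedged sketch of both follows; the variable printed by the skipped debug task is not visible in this run, so network_state is used purely as a placeholder, and the fully qualified ping module name is an assumption.

- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state          # placeholder; the real task may print a different variable
  when: network_state != {}

- name: Re-test connectivity
  ansible.builtin.ping: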
46400 1727204595.53781: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204595.53962: in run() - task 0affcd87-79f5-1303-fda8-0000000019d3 46400 1727204595.53984: variable 'ansible_search_path' from source: unknown 46400 1727204595.53991: variable 'ansible_search_path' from source: unknown 46400 1727204595.54043: calling self._execute() 46400 1727204595.54158: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.54171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.54184: variable 'omit' from source: magic vars 46400 1727204595.54613: variable 'ansible_distribution_major_version' from source: facts 46400 1727204595.54629: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204595.54773: variable 'network_state' from source: role '' defaults 46400 1727204595.54797: Evaluated conditional (network_state != {}): False 46400 1727204595.54808: when evaluation is False, skipping this task 46400 1727204595.54815: _execute() done 46400 1727204595.54822: dumping result to json 46400 1727204595.54829: done dumping result, returning 46400 1727204595.54843: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-0000000019d3] 46400 1727204595.54855: sending task result for task 0affcd87-79f5-1303-fda8-0000000019d3 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204595.55035: no more pending results, returning what we have 46400 1727204595.55040: results queue empty 46400 1727204595.55041: checking for any_errors_fatal 46400 1727204595.55051: done checking for any_errors_fatal 46400 1727204595.55052: checking for max_fail_percentage 46400 1727204595.55054: done checking for max_fail_percentage 46400 1727204595.55055: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.55056: done checking to see if all hosts have failed 46400 1727204595.55056: getting the remaining hosts for this loop 46400 1727204595.55058: done getting the remaining hosts for this loop 46400 1727204595.55062: getting the next task for host managed-node2 46400 1727204595.55072: done getting next task for host managed-node2 46400 1727204595.55078: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204595.55083: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204595.55111: getting variables 46400 1727204595.55113: in VariableManager get_vars() 46400 1727204595.55155: Calling all_inventory to load vars for managed-node2 46400 1727204595.55158: Calling groups_inventory to load vars for managed-node2 46400 1727204595.55160: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.55174: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.55177: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.55180: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.56248: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019d3 46400 1727204595.56252: WORKER PROCESS EXITING 46400 1727204595.57280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.59111: done with get_vars() 46400 1727204595.59142: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:15 -0400 (0:00:00.062) 0:01:25.877 ***** 46400 1727204595.59256: entering _queue_task() for managed-node2/ping 46400 1727204595.59641: worker is 1 (out of 1 available) 46400 1727204595.59652: exiting _queue_task() for managed-node2/ping 46400 1727204595.59667: done queuing things up, now waiting for results queue to drain 46400 1727204595.59669: waiting for pending results... 46400 1727204595.59990: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204595.60145: in run() - task 0affcd87-79f5-1303-fda8-0000000019d4 46400 1727204595.60176: variable 'ansible_search_path' from source: unknown 46400 1727204595.60184: variable 'ansible_search_path' from source: unknown 46400 1727204595.60228: calling self._execute() 46400 1727204595.60342: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.60354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.60374: variable 'omit' from source: magic vars 46400 1727204595.60798: variable 'ansible_distribution_major_version' from source: facts 46400 1727204595.60826: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204595.60838: variable 'omit' from source: magic vars 46400 1727204595.60931: variable 'omit' from source: magic vars 46400 1727204595.60972: variable 'omit' from source: magic vars 46400 1727204595.61030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204595.61077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204595.61111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204595.61147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.61168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204595.61209: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204595.61218: variable 'ansible_host' from source: host vars 
for 'managed-node2' 46400 1727204595.61226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.61339: Set connection var ansible_shell_type to sh 46400 1727204595.61366: Set connection var ansible_shell_executable to /bin/sh 46400 1727204595.61381: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204595.61393: Set connection var ansible_connection to ssh 46400 1727204595.61404: Set connection var ansible_pipelining to False 46400 1727204595.61420: Set connection var ansible_timeout to 10 46400 1727204595.61452: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.61472: variable 'ansible_connection' from source: unknown 46400 1727204595.61481: variable 'ansible_module_compression' from source: unknown 46400 1727204595.61489: variable 'ansible_shell_type' from source: unknown 46400 1727204595.61496: variable 'ansible_shell_executable' from source: unknown 46400 1727204595.61503: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204595.61510: variable 'ansible_pipelining' from source: unknown 46400 1727204595.61517: variable 'ansible_timeout' from source: unknown 46400 1727204595.61530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204595.61766: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204595.61785: variable 'omit' from source: magic vars 46400 1727204595.61801: starting attempt loop 46400 1727204595.61807: running the handler 46400 1727204595.61825: _low_level_execute_command(): starting 46400 1727204595.61836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204595.62674: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.62693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.62708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.62728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.62774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.62793: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.62807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.62824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204595.62839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.62850: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.62862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.62878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.62898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.62914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.62924: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204595.62938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.63027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.63051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.63072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.63148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.64787: stdout chunk (state=3): >>>/root <<< 46400 1727204595.64983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.64986: stdout chunk (state=3): >>><<< 46400 1727204595.64988: stderr chunk (state=3): >>><<< 46400 1727204595.65106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204595.65110: _low_level_execute_command(): starting 46400 1727204595.65113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571 `" && echo ansible-tmp-1727204595.6501007-52473-155194521511571="` echo /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571 `" ) && sleep 0' 46400 1727204595.66012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.66016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.66053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204595.66057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.66059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.66138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.66152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.66224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.68084: stdout chunk (state=3): >>>ansible-tmp-1727204595.6501007-52473-155194521511571=/root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571 <<< 46400 1727204595.68288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.68292: stdout chunk (state=3): >>><<< 46400 1727204595.68294: stderr chunk (state=3): >>><<< 46400 1727204595.68375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204595.6501007-52473-155194521511571=/root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204595.68379: variable 'ansible_module_compression' from source: unknown 46400 1727204595.68496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204595.68499: variable 'ansible_facts' from source: unknown 46400 1727204595.68542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/AnsiballZ_ping.py 46400 1727204595.68710: Sending initial data 46400 1727204595.68713: Sent initial data (153 bytes) 46400 1727204595.69725: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.69740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.69756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.69782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.69830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.69843: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.69859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.69880: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 46400 1727204595.69894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.69906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.69927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.69942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.69959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.69975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.69987: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204595.70001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.70084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.70107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.70125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.70200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.71915: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204595.71940: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204595.71987: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpqd_rajwx /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/AnsiballZ_ping.py <<< 46400 1727204595.72019: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204595.73052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.73351: stderr chunk (state=3): >>><<< 46400 1727204595.73354: stdout chunk (state=3): >>><<< 46400 1727204595.73357: done transferring module to remote 46400 1727204595.73359: _low_level_execute_command(): starting 46400 1727204595.73361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/ /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/AnsiballZ_ping.py && sleep 0' 46400 1727204595.74236: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.74239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.74275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
not found <<< 46400 1727204595.74282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.74284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.74287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.74354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.74370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.74436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.76221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.76224: stdout chunk (state=3): >>><<< 46400 1727204595.76227: stderr chunk (state=3): >>><<< 46400 1727204595.76331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204595.76337: _low_level_execute_command(): starting 46400 1727204595.76340: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/AnsiballZ_ping.py && sleep 0' 46400 1727204595.76969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.76992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.77011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.77030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.77074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.77088: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.77114: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.77134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204595.77147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.77159: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.77174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.77189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.77211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.77228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.77241: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204595.77255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.77341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.77359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.77377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.77454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.90289: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204595.91287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204595.91339: stderr chunk (state=3): >>><<< 46400 1727204595.91343: stdout chunk (state=3): >>><<< 46400 1727204595.91361: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
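
The span above is one complete remote execution of the ping module: _low_level_execute_command() resolves the remote home with 'echo ~', creates a per-task temp directory under ~/.ansible/tmp, transfers the AnsiballZ-packaged module over the SFTP subsystem, marks it executable, runs it with the remote interpreter (/usr/bin/python3.9 here), and reads back {"ping": "pong"}. A rough sketch of the same observable sequence using plain ssh/scp via subprocess; this is an approximation for orientation only, not Ansible's implementation (local_module is a hypothetical path, and scp stands in for the sftp 'put' seen in the log):

    import subprocess, time

    host = "10.31.13.78"                 # managed-node2, per the log
    local_module = "AnsiballZ_ping.py"   # hypothetical local path to the packaged module

    def ssh(cmd):
        # mirrors _low_level_execute_command(): run a command through /bin/sh on the target
        return subprocess.run(["ssh", host, f"/bin/sh -c '{cmd}'"],
                              capture_output=True, text=True, check=True).stdout

    home = ssh("echo ~ && sleep 0").strip()
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f'( umask 77 && mkdir -p "{tmpdir}" ) && sleep 0')
    subprocess.run(["scp", local_module, f"{host}:{tmpdir}/AnsiballZ_ping.py"], check=True)
    ssh(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_ping.py && sleep 0")
    print(ssh(f"/usr/bin/python3.9 {tmpdir}/AnsiballZ_ping.py && sleep 0"))  # {"ping": "pong", ...}
    ssh(f"rm -f -r {tmpdir} > /dev/null 2>&1 && sleep 0")                    # cleanup, as logged next
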
46400 1727204595.91391: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204595.91401: _low_level_execute_command(): starting 46400 1727204595.91407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204595.6501007-52473-155194521511571/ > /dev/null 2>&1 && sleep 0' 46400 1727204595.92075: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204595.92084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.92095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.92109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.92149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.92155: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204595.92171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.92184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204595.92192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204595.92198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204595.92207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204595.92217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204595.92227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204595.92233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204595.92239: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204595.92248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204595.92342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204595.92350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204595.92352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204595.92419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204595.94219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204595.94304: stderr chunk (state=3): >>><<< 46400 1727204595.94308: stdout chunk (state=3): >>><<< 46400 1727204595.94325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204595.94332: handler run complete 46400 1727204595.94350: attempt loop complete, returning result 46400 1727204595.94354: _execute() done 46400 1727204595.94356: dumping result to json 46400 1727204595.94358: done dumping result, returning 46400 1727204595.94376: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-0000000019d4] 46400 1727204595.94379: sending task result for task 0affcd87-79f5-1303-fda8-0000000019d4 46400 1727204595.94481: done sending task result for task 0affcd87-79f5-1303-fda8-0000000019d4 46400 1727204595.94484: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204595.94547: no more pending results, returning what we have 46400 1727204595.94551: results queue empty 46400 1727204595.94552: checking for any_errors_fatal 46400 1727204595.94560: done checking for any_errors_fatal 46400 1727204595.94560: checking for max_fail_percentage 46400 1727204595.94562: done checking for max_fail_percentage 46400 1727204595.94565: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.94566: done checking to see if all hosts have failed 46400 1727204595.94567: getting the remaining hosts for this loop 46400 1727204595.94569: done getting the remaining hosts for this loop 46400 1727204595.94572: getting the next task for host managed-node2 46400 1727204595.94584: done getting next task for host managed-node2 46400 1727204595.94586: ^ task is: TASK: meta (role_complete) 46400 1727204595.94591: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204595.94604: getting variables 46400 1727204595.94605: in VariableManager get_vars() 46400 1727204595.94650: Calling all_inventory to load vars for managed-node2 46400 1727204595.94653: Calling groups_inventory to load vars for managed-node2 46400 1727204595.94655: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.94667: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.94669: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.94672: Calling groups_plugins_play to load vars for managed-node2 46400 1727204595.96205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204595.98524: done with get_vars() 46400 1727204595.98553: done getting variables 46400 1727204595.98818: done queuing things up, now waiting for results queue to drain 46400 1727204595.98899: results queue empty 46400 1727204595.98900: checking for any_errors_fatal 46400 1727204595.98903: done checking for any_errors_fatal 46400 1727204595.98904: checking for max_fail_percentage 46400 1727204595.98905: done checking for max_fail_percentage 46400 1727204595.98906: checking to see if all hosts have failed and the running result is not ok 46400 1727204595.98906: done checking to see if all hosts have failed 46400 1727204595.98907: getting the remaining hosts for this loop 46400 1727204595.98908: done getting the remaining hosts for this loop 46400 1727204595.98911: getting the next task for host managed-node2 46400 1727204595.98923: done getting next task for host managed-node2 46400 1727204595.98926: ^ task is: TASK: Include network role 46400 1727204595.98933: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204595.98941: getting variables 46400 1727204595.98942: in VariableManager get_vars() 46400 1727204595.98956: Calling all_inventory to load vars for managed-node2 46400 1727204595.98959: Calling groups_inventory to load vars for managed-node2 46400 1727204595.98961: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204595.98969: Calling all_plugins_play to load vars for managed-node2 46400 1727204595.98972: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204595.98975: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.01429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.04722: done with get_vars() 46400 1727204596.04758: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.456) 0:01:26.333 ***** 46400 1727204596.04862: entering _queue_task() for managed-node2/include_role 46400 1727204596.05252: worker is 1 (out of 1 available) 46400 1727204596.05267: exiting _queue_task() for managed-node2/include_role 46400 1727204596.05281: done queuing things up, now waiting for results queue to drain 46400 1727204596.05283: waiting for pending results... 46400 1727204596.05616: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204596.05793: in run() - task 0affcd87-79f5-1303-fda8-0000000017d9 46400 1727204596.05812: variable 'ansible_search_path' from source: unknown 46400 1727204596.05819: variable 'ansible_search_path' from source: unknown 46400 1727204596.05874: calling self._execute() 46400 1727204596.05993: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.06005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.06020: variable 'omit' from source: magic vars 46400 1727204596.06452: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.06474: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.06487: _execute() done 46400 1727204596.06499: dumping result to json 46400 1727204596.06509: done dumping result, returning 46400 1727204596.06525: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-0000000017d9] 46400 1727204596.06539: sending task result for task 0affcd87-79f5-1303-fda8-0000000017d9 46400 1727204596.06701: no more pending results, returning what we have 46400 1727204596.06706: in VariableManager get_vars() 46400 1727204596.06756: Calling all_inventory to load vars for managed-node2 46400 1727204596.06759: Calling groups_inventory to load vars for managed-node2 46400 1727204596.06763: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.06779: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.06782: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.06785: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.07967: done sending task result for task 0affcd87-79f5-1303-fda8-0000000017d9 46400 1727204596.07971: WORKER PROCESS EXITING 46400 1727204596.08931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.10732: done 
with get_vars() 46400 1727204596.10757: variable 'ansible_search_path' from source: unknown 46400 1727204596.10758: variable 'ansible_search_path' from source: unknown 46400 1727204596.10935: variable 'omit' from source: magic vars 46400 1727204596.10983: variable 'omit' from source: magic vars 46400 1727204596.11005: variable 'omit' from source: magic vars 46400 1727204596.11009: we have included files to process 46400 1727204596.11016: generating all_blocks data 46400 1727204596.11018: done generating all_blocks data 46400 1727204596.11023: processing included file: fedora.linux_system_roles.network 46400 1727204596.11046: in VariableManager get_vars() 46400 1727204596.11062: done with get_vars() 46400 1727204596.11094: in VariableManager get_vars() 46400 1727204596.11117: done with get_vars() 46400 1727204596.11162: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204596.11304: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204596.11401: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204596.11917: in VariableManager get_vars() 46400 1727204596.11938: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204596.14149: iterating over new_blocks loaded from include file 46400 1727204596.14152: in VariableManager get_vars() 46400 1727204596.14181: done with get_vars() 46400 1727204596.14184: filtering new block on tags 46400 1727204596.14514: done filtering new block on tags 46400 1727204596.14518: in VariableManager get_vars() 46400 1727204596.14535: done with get_vars() 46400 1727204596.14536: filtering new block on tags 46400 1727204596.14553: done filtering new block on tags 46400 1727204596.14555: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204596.14561: extending task lists for all hosts with included blocks 46400 1727204596.14689: done extending task lists 46400 1727204596.14691: done processing included files 46400 1727204596.14691: results queue empty 46400 1727204596.14692: checking for any_errors_fatal 46400 1727204596.14694: done checking for any_errors_fatal 46400 1727204596.14695: checking for max_fail_percentage 46400 1727204596.14696: done checking for max_fail_percentage 46400 1727204596.14697: checking to see if all hosts have failed and the running result is not ok 46400 1727204596.14697: done checking to see if all hosts have failed 46400 1727204596.14698: getting the remaining hosts for this loop 46400 1727204596.14699: done getting the remaining hosts for this loop 46400 1727204596.14702: getting the next task for host managed-node2 46400 1727204596.14706: done getting next task for host managed-node2 46400 1727204596.14714: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204596.14717: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204596.14734: getting variables 46400 1727204596.14735: in VariableManager get_vars() 46400 1727204596.14749: Calling all_inventory to load vars for managed-node2 46400 1727204596.14751: Calling groups_inventory to load vars for managed-node2 46400 1727204596.14753: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.14758: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.14760: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.14763: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.16278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.18087: done with get_vars() 46400 1727204596.18119: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.133) 0:01:26.466 ***** 46400 1727204596.18212: entering _queue_task() for managed-node2/include_tasks 46400 1727204596.18597: worker is 1 (out of 1 available) 46400 1727204596.18610: exiting _queue_task() for managed-node2/include_tasks 46400 1727204596.18623: done queuing things up, now waiting for results queue to drain 46400 1727204596.18629: waiting for pending results... 
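
The include_role processing above touches the role's standard layout in order: defaults/main.yml, meta/main.yml and tasks/main.yml are loaded, the yum action is redirected to dnf, and the new blocks are filtered on tags before being appended to the host's task list. A small sketch that reads that same layout with PyYAML, just to make the three files the loader visited concrete (assumes a local checkout at the path shown in the log; this is not Ansible's loader):

    from pathlib import Path
    import yaml  # PyYAML

    role = Path("/tmp/collections-G1p/ansible_collections/fedora/"
                "linux_system_roles/roles/network")

    for part in ("defaults/main.yml", "meta/main.yml", "tasks/main.yml"):
        f = role / part
        if f.exists():                        # the run above loads exactly these three files
            data = yaml.safe_load(f.read_text())
            print(part, "->", type(data).__name__)
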
46400 1727204596.18944: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204596.19114: in run() - task 0affcd87-79f5-1303-fda8-000000001b3b 46400 1727204596.19135: variable 'ansible_search_path' from source: unknown 46400 1727204596.19142: variable 'ansible_search_path' from source: unknown 46400 1727204596.19192: calling self._execute() 46400 1727204596.19303: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.19315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.19327: variable 'omit' from source: magic vars 46400 1727204596.19742: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.19759: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.19773: _execute() done 46400 1727204596.19780: dumping result to json 46400 1727204596.19787: done dumping result, returning 46400 1727204596.19796: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000001b3b] 46400 1727204596.19805: sending task result for task 0affcd87-79f5-1303-fda8-000000001b3b 46400 1727204596.19933: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b3b 46400 1727204596.19992: no more pending results, returning what we have 46400 1727204596.19997: in VariableManager get_vars() 46400 1727204596.20054: Calling all_inventory to load vars for managed-node2 46400 1727204596.20057: Calling groups_inventory to load vars for managed-node2 46400 1727204596.20059: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.20074: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.20077: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.20080: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.21115: WORKER PROCESS EXITING 46400 1727204596.21965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.23256: done with get_vars() 46400 1727204596.23279: variable 'ansible_search_path' from source: unknown 46400 1727204596.23280: variable 'ansible_search_path' from source: unknown 46400 1727204596.23309: we have included files to process 46400 1727204596.23310: generating all_blocks data 46400 1727204596.23311: done generating all_blocks data 46400 1727204596.23315: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204596.23315: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204596.23317: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204596.23735: done processing included file 46400 1727204596.23737: iterating over new_blocks loaded from include file 46400 1727204596.23738: in VariableManager get_vars() 46400 1727204596.23758: done with get_vars() 46400 1727204596.23761: filtering new block on tags 46400 1727204596.23786: done filtering new block on tags 46400 1727204596.23788: in VariableManager get_vars() 46400 1727204596.23803: done with get_vars() 46400 1727204596.23804: filtering new block on tags 46400 1727204596.23830: done filtering new block on tags 46400 1727204596.23832: in 
VariableManager get_vars() 46400 1727204596.23845: done with get_vars() 46400 1727204596.23846: filtering new block on tags 46400 1727204596.23878: done filtering new block on tags 46400 1727204596.23880: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204596.23884: extending task lists for all hosts with included blocks 46400 1727204596.25415: done extending task lists 46400 1727204596.25416: done processing included files 46400 1727204596.25417: results queue empty 46400 1727204596.25418: checking for any_errors_fatal 46400 1727204596.25421: done checking for any_errors_fatal 46400 1727204596.25422: checking for max_fail_percentage 46400 1727204596.25423: done checking for max_fail_percentage 46400 1727204596.25424: checking to see if all hosts have failed and the running result is not ok 46400 1727204596.25425: done checking to see if all hosts have failed 46400 1727204596.25426: getting the remaining hosts for this loop 46400 1727204596.25427: done getting the remaining hosts for this loop 46400 1727204596.25429: getting the next task for host managed-node2 46400 1727204596.25434: done getting next task for host managed-node2 46400 1727204596.25437: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204596.25441: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204596.25453: getting variables 46400 1727204596.25454: in VariableManager get_vars() 46400 1727204596.25471: Calling all_inventory to load vars for managed-node2 46400 1727204596.25473: Calling groups_inventory to load vars for managed-node2 46400 1727204596.25480: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.25492: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.25495: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.25498: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.26813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.28546: done with get_vars() 46400 1727204596.28575: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.104) 0:01:26.570 ***** 46400 1727204596.28637: entering _queue_task() for managed-node2/setup 46400 1727204596.28899: worker is 1 (out of 1 available) 46400 1727204596.28913: exiting _queue_task() for managed-node2/setup 46400 1727204596.28927: done queuing things up, now waiting for results queue to drain 46400 1727204596.28930: waiting for pending results... 46400 1727204596.29133: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204596.29252: in run() - task 0affcd87-79f5-1303-fda8-000000001b92 46400 1727204596.29270: variable 'ansible_search_path' from source: unknown 46400 1727204596.29274: variable 'ansible_search_path' from source: unknown 46400 1727204596.29303: calling self._execute() 46400 1727204596.29383: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.29386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.29395: variable 'omit' from source: magic vars 46400 1727204596.29687: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.29698: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.29854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204596.31981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204596.32038: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204596.32068: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204596.32095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204596.32115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204596.32177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204596.32196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204596.32214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204596.32242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204596.32256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204596.32295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204596.32311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204596.32327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204596.32355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204596.32370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204596.32483: variable '__network_required_facts' from source: role '' defaults 46400 1727204596.32487: variable 'ansible_facts' from source: unknown 46400 1727204596.32966: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204596.32970: when evaluation is False, skipping this task 46400 1727204596.32972: _execute() done 46400 1727204596.32975: dumping result to json 46400 1727204596.32977: done dumping result, returning 46400 1727204596.32980: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000001b92] 46400 1727204596.32986: sending task result for task 0affcd87-79f5-1303-fda8-000000001b92 46400 1727204596.33087: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b92 46400 1727204596.33090: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204596.33141: no more pending results, returning what we have 46400 1727204596.33145: results queue empty 46400 1727204596.33146: checking for any_errors_fatal 46400 1727204596.33148: done checking for any_errors_fatal 46400 1727204596.33148: checking for max_fail_percentage 46400 1727204596.33150: done checking for max_fail_percentage 46400 1727204596.33151: checking to see if all hosts have failed and the running result is not ok 46400 1727204596.33152: done checking to see if all hosts have failed 46400 1727204596.33152: getting the remaining hosts for this loop 46400 1727204596.33154: done getting the remaining hosts for 
this loop 46400 1727204596.33158: getting the next task for host managed-node2 46400 1727204596.33173: done getting next task for host managed-node2 46400 1727204596.33178: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204596.33184: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204596.33210: getting variables 46400 1727204596.33211: in VariableManager get_vars() 46400 1727204596.33253: Calling all_inventory to load vars for managed-node2 46400 1727204596.33255: Calling groups_inventory to load vars for managed-node2 46400 1727204596.33257: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.33271: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.33274: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.33282: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.34132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.35224: done with get_vars() 46400 1727204596.35242: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.066) 0:01:26.637 ***** 46400 1727204596.35325: entering _queue_task() for managed-node2/stat 46400 1727204596.35597: worker is 1 (out of 1 available) 46400 1727204596.35610: exiting _queue_task() for managed-node2/stat 46400 1727204596.35624: done queuing things up, now waiting for results queue to drain 46400 1727204596.35626: waiting for pending results... 
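
The task just skipped ("Ensure ansible_facts used by role are present") and the ones queued next ("Check if system is ostree", "Set flag to indicate system is ostree", "Check which services are running") are all defined in the role's tasks/set_facts.yml, at lines 12, 17 and 21 according to the task paths printed in this log. For orientation, here is a minimal YAML sketch of what those tasks plausibly look like, reconstructed only from the task names, module types (stat, set_fact, service_facts) and conditionals that the log itself prints; the setup/gather_subset choice, the /run/ostree-booted path and the __ostree_booted_stat register name are assumptions, not values taken from this output.

# Approximate sketch of roles/network/tasks/set_facts.yml as traced by this log.
# Task names and "when" conditions come from the log; module arguments,
# the stat path and the register variable name are assumptions.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:            # assumed module; the log only shows the skip
    gather_subset: min              # assumed argument
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                      # confirmed by the censored result above

- name: Check if system is ostree
  ansible.builtin.stat:             # module confirmed by "_queue_task() for managed-node2/stat"
    path: /run/ostree-booted        # assumed path
  register: __ostree_booted_stat    # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

Because __network_is_ostree was already persisted with set_fact on an earlier pass, the condition not __network_is_ostree is defined now evaluates to False, which is exactly why both the stat task and the set_fact task are reported as skipped in the log entries that follow.
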
46400 1727204596.35827: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204596.35934: in run() - task 0affcd87-79f5-1303-fda8-000000001b94 46400 1727204596.35951: variable 'ansible_search_path' from source: unknown 46400 1727204596.35955: variable 'ansible_search_path' from source: unknown 46400 1727204596.35987: calling self._execute() 46400 1727204596.36072: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.36078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.36082: variable 'omit' from source: magic vars 46400 1727204596.36373: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.36384: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.36511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204596.36708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204596.36741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204596.36768: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204596.36794: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204596.36861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204596.36883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204596.36902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204596.36921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204596.37007: variable '__network_is_ostree' from source: set_fact 46400 1727204596.37011: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204596.37014: when evaluation is False, skipping this task 46400 1727204596.37016: _execute() done 46400 1727204596.37020: dumping result to json 46400 1727204596.37022: done dumping result, returning 46400 1727204596.37029: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000001b94] 46400 1727204596.37036: sending task result for task 0affcd87-79f5-1303-fda8-000000001b94 46400 1727204596.37125: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b94 46400 1727204596.37128: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204596.37186: no more pending results, returning what we have 46400 1727204596.37190: results queue empty 46400 1727204596.37191: checking for any_errors_fatal 46400 1727204596.37200: done checking for any_errors_fatal 46400 1727204596.37201: checking for 
max_fail_percentage 46400 1727204596.37203: done checking for max_fail_percentage 46400 1727204596.37204: checking to see if all hosts have failed and the running result is not ok 46400 1727204596.37205: done checking to see if all hosts have failed 46400 1727204596.37205: getting the remaining hosts for this loop 46400 1727204596.37207: done getting the remaining hosts for this loop 46400 1727204596.37211: getting the next task for host managed-node2 46400 1727204596.37220: done getting next task for host managed-node2 46400 1727204596.37224: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204596.37230: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204596.37265: getting variables 46400 1727204596.37267: in VariableManager get_vars() 46400 1727204596.37308: Calling all_inventory to load vars for managed-node2 46400 1727204596.37311: Calling groups_inventory to load vars for managed-node2 46400 1727204596.37313: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.37322: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.37325: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.37327: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.38160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.39090: done with get_vars() 46400 1727204596.39109: done getting variables 46400 1727204596.39153: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.038) 0:01:26.676 ***** 46400 1727204596.39184: entering _queue_task() for managed-node2/set_fact 46400 1727204596.39423: worker is 1 (out of 1 available) 46400 1727204596.39436: exiting _queue_task() for managed-node2/set_fact 46400 1727204596.39450: done queuing things up, now waiting for results queue to drain 46400 1727204596.39452: waiting for pending results... 46400 1727204596.39650: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204596.39748: in run() - task 0affcd87-79f5-1303-fda8-000000001b95 46400 1727204596.39771: variable 'ansible_search_path' from source: unknown 46400 1727204596.39779: variable 'ansible_search_path' from source: unknown 46400 1727204596.39800: calling self._execute() 46400 1727204596.39881: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.39885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.39894: variable 'omit' from source: magic vars 46400 1727204596.40176: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.40186: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.40307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204596.40509: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204596.40545: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204596.40575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204596.40600: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204596.40668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204596.40687: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204596.40705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204596.40722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204596.40792: variable '__network_is_ostree' from source: set_fact 46400 1727204596.40797: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204596.40801: when evaluation is False, skipping this task 46400 1727204596.40803: _execute() done 46400 1727204596.40806: dumping result to json 46400 1727204596.40808: done dumping result, returning 46400 1727204596.40815: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000001b95] 46400 1727204596.40820: sending task result for task 0affcd87-79f5-1303-fda8-000000001b95 46400 1727204596.40907: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b95 46400 1727204596.40910: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204596.40958: no more pending results, returning what we have 46400 1727204596.40961: results queue empty 46400 1727204596.40963: checking for any_errors_fatal 46400 1727204596.40975: done checking for any_errors_fatal 46400 1727204596.40976: checking for max_fail_percentage 46400 1727204596.40978: done checking for max_fail_percentage 46400 1727204596.40979: checking to see if all hosts have failed and the running result is not ok 46400 1727204596.40980: done checking to see if all hosts have failed 46400 1727204596.40980: getting the remaining hosts for this loop 46400 1727204596.40982: done getting the remaining hosts for this loop 46400 1727204596.40986: getting the next task for host managed-node2 46400 1727204596.40997: done getting next task for host managed-node2 46400 1727204596.41001: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204596.41006: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204596.41035: getting variables 46400 1727204596.41037: in VariableManager get_vars() 46400 1727204596.41075: Calling all_inventory to load vars for managed-node2 46400 1727204596.41078: Calling groups_inventory to load vars for managed-node2 46400 1727204596.41080: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204596.41089: Calling all_plugins_play to load vars for managed-node2 46400 1727204596.41091: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204596.41094: Calling groups_plugins_play to load vars for managed-node2 46400 1727204596.42053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204596.47031: done with get_vars() 46400 1727204596.47052: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:16 -0400 (0:00:00.079) 0:01:26.755 ***** 46400 1727204596.47121: entering _queue_task() for managed-node2/service_facts 46400 1727204596.47377: worker is 1 (out of 1 available) 46400 1727204596.47391: exiting _queue_task() for managed-node2/service_facts 46400 1727204596.47405: done queuing things up, now waiting for results queue to drain 46400 1727204596.47407: waiting for pending results... 
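
From here the log traces the full remote execution path of the service_facts task: the ssh connection variables are set, a temporary directory is created under /root/.ansible/tmp on the target, the AnsiballZ_service_facts.py payload is copied over sftp, marked executable, run with /usr/bin/python3.9, and its JSON reply is read back to populate ansible_facts.services. A minimal, self-contained playbook that exercises the same module and reads one entry of that dictionary might look as follows; the host pattern and the debug task are illustrative additions, not part of the role being traced.

# Standalone sketch of the step the log traces below: gather service facts
# and inspect one of the entries that service_facts returns.
- hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:      # takes no arguments; the log shows "module_args": {}

    - name: Report the state service_facts recorded for NetworkManager
      ansible.builtin.debug:
        var: ansible_facts.services['NetworkManager.service'].state

The resulting dictionary is keyed by unit name and carries name, state, status and source for each service, as the JSON dump later in the log shows; gathering it once up front presumably lets the role decide which network provider is active without querying each service individually.
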
46400 1727204596.47604: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204596.47719: in run() - task 0affcd87-79f5-1303-fda8-000000001b97 46400 1727204596.47730: variable 'ansible_search_path' from source: unknown 46400 1727204596.47734: variable 'ansible_search_path' from source: unknown 46400 1727204596.47770: calling self._execute() 46400 1727204596.47843: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.47848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.47855: variable 'omit' from source: magic vars 46400 1727204596.48149: variable 'ansible_distribution_major_version' from source: facts 46400 1727204596.48161: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204596.48166: variable 'omit' from source: magic vars 46400 1727204596.48219: variable 'omit' from source: magic vars 46400 1727204596.48243: variable 'omit' from source: magic vars 46400 1727204596.48278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204596.48313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204596.48326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204596.48340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204596.48349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204596.48374: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204596.48377: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.48380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.48449: Set connection var ansible_shell_type to sh 46400 1727204596.48457: Set connection var ansible_shell_executable to /bin/sh 46400 1727204596.48463: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204596.48468: Set connection var ansible_connection to ssh 46400 1727204596.48474: Set connection var ansible_pipelining to False 46400 1727204596.48479: Set connection var ansible_timeout to 10 46400 1727204596.48500: variable 'ansible_shell_executable' from source: unknown 46400 1727204596.48503: variable 'ansible_connection' from source: unknown 46400 1727204596.48506: variable 'ansible_module_compression' from source: unknown 46400 1727204596.48509: variable 'ansible_shell_type' from source: unknown 46400 1727204596.48514: variable 'ansible_shell_executable' from source: unknown 46400 1727204596.48516: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204596.48519: variable 'ansible_pipelining' from source: unknown 46400 1727204596.48523: variable 'ansible_timeout' from source: unknown 46400 1727204596.48525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204596.48667: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204596.48674: variable 'omit' from source: magic vars 46400 
1727204596.48679: starting attempt loop 46400 1727204596.48682: running the handler 46400 1727204596.48693: _low_level_execute_command(): starting 46400 1727204596.48698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204596.49226: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204596.49245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.49258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.49278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.49327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204596.49340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204596.49398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204596.51044: stdout chunk (state=3): >>>/root <<< 46400 1727204596.51148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204596.51210: stderr chunk (state=3): >>><<< 46400 1727204596.51214: stdout chunk (state=3): >>><<< 46400 1727204596.51240: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204596.51251: _low_level_execute_command(): starting 46400 1727204596.51257: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641 `" && echo ansible-tmp-1727204596.512353-52502-62954483421641="` echo 
/root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641 `" ) && sleep 0' 46400 1727204596.51738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.51751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.51773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.51797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.51840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204596.51852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204596.51906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204596.53763: stdout chunk (state=3): >>>ansible-tmp-1727204596.512353-52502-62954483421641=/root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641 <<< 46400 1727204596.53875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204596.53933: stderr chunk (state=3): >>><<< 46400 1727204596.53938: stdout chunk (state=3): >>><<< 46400 1727204596.53966: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204596.512353-52502-62954483421641=/root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204596.54005: variable 'ansible_module_compression' from source: unknown 46400 1727204596.54043: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204596.54077: 
variable 'ansible_facts' from source: unknown 46400 1727204596.54134: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/AnsiballZ_service_facts.py 46400 1727204596.54249: Sending initial data 46400 1727204596.54255: Sent initial data (160 bytes) 46400 1727204596.54973: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204596.54982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.55012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204596.55016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.55020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.55077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204596.55080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204596.55127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204596.56839: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204596.56878: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204596.56910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp1jpvghz6 /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/AnsiballZ_service_facts.py <<< 46400 1727204596.56946: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204596.57744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204596.57855: stderr chunk (state=3): >>><<< 46400 1727204596.57858: stdout chunk (state=3): >>><<< 46400 1727204596.57886: done transferring module to remote 46400 1727204596.57898: _low_level_execute_command(): starting 46400 1727204596.57925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/ /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/AnsiballZ_service_facts.py 
&& sleep 0' 46400 1727204596.58586: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204596.58600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.58620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204596.58639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.58686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204596.58698: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204596.58712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.58735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204596.58748: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204596.58761: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204596.58778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.58793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204596.58809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.58822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204596.58835: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204596.58856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.58933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204596.58961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204596.58983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204596.59058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204596.60748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204596.60805: stderr chunk (state=3): >>><<< 46400 1727204596.60809: stdout chunk (state=3): >>><<< 46400 1727204596.60824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204596.60827: _low_level_execute_command(): starting 46400 1727204596.60830: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/AnsiballZ_service_facts.py && sleep 0' 46400 1727204596.61284: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.61288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204596.61344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204596.61347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.61350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204596.61352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204596.61354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204596.61405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204596.61411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204596.61470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204597.91108: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 46400 1727204597.91115: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204597.91126: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 46400 1727204597.91129: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 46400 1727204597.91170: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": 
"systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204597.92514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204597.92518: stdout chunk (state=3): >>><<< 46400 1727204597.92520: stderr chunk (state=3): >>><<< 46400 1727204597.92785: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204597.93123: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204597.93137: _low_level_execute_command(): starting 46400 1727204597.93140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204596.512353-52502-62954483421641/ > /dev/null 2>&1 && sleep 0' 46400 1727204597.93581: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204597.93594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204597.93605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204597.93616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204597.93671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204597.93687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204597.93730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204597.95583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204597.95586: stdout chunk (state=3): >>><<< 46400 1727204597.95593: stderr chunk (state=3): >>><<< 46400 1727204597.95608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204597.95614: handler run complete 46400 1727204597.95804: variable 'ansible_facts' from source: unknown 46400 1727204597.95957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204597.97073: variable 'ansible_facts' from source: unknown 46400 1727204597.97086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204597.97261: attempt loop complete, returning result 46400 1727204597.97272: _execute() done 46400 1727204597.97275: dumping result to json 46400 1727204597.97321: done dumping result, returning 46400 1727204597.97330: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000001b97] 46400 1727204597.97335: sending task result for task 0affcd87-79f5-1303-fda8-000000001b97 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204597.98101: no more pending results, returning what we have 46400 1727204597.98105: results queue empty 46400 1727204597.98106: checking for any_errors_fatal 46400 1727204597.98113: done checking for any_errors_fatal 46400 1727204597.98114: checking for max_fail_percentage 46400 1727204597.98116: done checking for max_fail_percentage 46400 1727204597.98117: checking to see if all hosts have failed and the running result is not ok 46400 1727204597.98117: done checking to see if all hosts have failed 46400 1727204597.98118: getting the remaining hosts for this loop 46400 1727204597.98120: done getting the remaining hosts for this loop 46400 1727204597.98123: getting the next task for host managed-node2 46400 1727204597.98131: done getting next task for host managed-node2 46400 1727204597.98135: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204597.98142: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204597.98156: getting variables 46400 1727204597.98158: in VariableManager get_vars() 46400 1727204597.98194: Calling all_inventory to load vars for managed-node2 46400 1727204597.98197: Calling groups_inventory to load vars for managed-node2 46400 1727204597.98200: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204597.98210: Calling all_plugins_play to load vars for managed-node2 46400 1727204597.98213: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204597.98216: Calling groups_plugins_play to load vars for managed-node2 46400 1727204597.98931: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b97 46400 1727204597.98934: WORKER PROCESS EXITING 46400 1727204598.00616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.02598: done with get_vars() 46400 1727204598.02630: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:18 -0400 (0:00:01.556) 0:01:28.312 ***** 46400 1727204598.02751: entering _queue_task() for managed-node2/package_facts 46400 1727204598.03168: worker is 1 (out of 1 available) 46400 1727204598.03181: exiting _queue_task() for managed-node2/package_facts 46400 1727204598.03199: done queuing things up, now waiting for results queue to drain 46400 1727204598.03200: waiting for pending results... 
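The task queued here runs the package_facts module through the same low-level path used for service_facts (temporary directory, AnsiballZ transfer, remote python3.9 execution). Its ansible_facts.packages output, whose beginning appears further down in this log, maps each package name to a list of installed instances with version, release, epoch, arch, and source fields. A minimal Python sketch for reading a captured copy of that payload, assuming it has been saved locally; the file name is hypothetical and the looked-up package names are only examples taken from the log.

#!/usr/bin/env python3
"""Minimal sketch: inspect a captured package_facts payload.

Assumes the {"ansible_facts": {"packages": {...}}} JSON emitted by this
task (its beginning appears further down in the log) has been saved to a
local file; the path below is hypothetical.
"""
import json

PAYLOAD = "package_facts.json"  # hypothetical capture of the module stdout

with open(PAYLOAD) as fh:
    packages = json.load(fh)["ansible_facts"]["packages"]

# Each key maps to a list of installed instances; after json.load the
# JSON null epoch becomes None, e.g.
#   "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9",
#              "epoch": None, "arch": "x86_64", "source": "rpm"}]
for name in ("libgcc", "glibc", "bash"):  # example names seen in the log
    for pkg in packages.get(name, []):
        epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
        print(f"{name} {epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}")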
46400 1727204598.03569: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204598.03783: in run() - task 0affcd87-79f5-1303-fda8-000000001b98 46400 1727204598.03804: variable 'ansible_search_path' from source: unknown 46400 1727204598.03811: variable 'ansible_search_path' from source: unknown 46400 1727204598.03856: calling self._execute() 46400 1727204598.03994: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.04027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.04043: variable 'omit' from source: magic vars 46400 1727204598.04491: variable 'ansible_distribution_major_version' from source: facts 46400 1727204598.04514: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204598.04531: variable 'omit' from source: magic vars 46400 1727204598.04626: variable 'omit' from source: magic vars 46400 1727204598.04675: variable 'omit' from source: magic vars 46400 1727204598.04728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204598.04781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204598.04807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204598.04830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204598.04857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204598.04895: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204598.04904: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.04911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.05021: Set connection var ansible_shell_type to sh 46400 1727204598.05036: Set connection var ansible_shell_executable to /bin/sh 46400 1727204598.05047: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204598.05072: Set connection var ansible_connection to ssh 46400 1727204598.05084: Set connection var ansible_pipelining to False 46400 1727204598.05093: Set connection var ansible_timeout to 10 46400 1727204598.05122: variable 'ansible_shell_executable' from source: unknown 46400 1727204598.05130: variable 'ansible_connection' from source: unknown 46400 1727204598.05136: variable 'ansible_module_compression' from source: unknown 46400 1727204598.05142: variable 'ansible_shell_type' from source: unknown 46400 1727204598.05147: variable 'ansible_shell_executable' from source: unknown 46400 1727204598.05154: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.05172: variable 'ansible_pipelining' from source: unknown 46400 1727204598.05184: variable 'ansible_timeout' from source: unknown 46400 1727204598.05191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.05427: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204598.05445: variable 'omit' from source: magic vars 46400 
1727204598.05456: starting attempt loop 46400 1727204598.05469: running the handler 46400 1727204598.05495: _low_level_execute_command(): starting 46400 1727204598.05517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204598.06441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204598.06458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.06479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.06505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.06556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.06576: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204598.06591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.06616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204598.06634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204598.06646: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204598.06658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.06678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.06694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.06707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.06720: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204598.06744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.06823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204598.06862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.06888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204598.06977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.08662: stdout chunk (state=3): >>>/root <<< 46400 1727204598.08740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204598.08969: stderr chunk (state=3): >>><<< 46400 1727204598.08986: stdout chunk (state=3): >>><<< 46400 1727204598.09127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204598.09131: _low_level_execute_command(): starting 46400 1727204598.09135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888 `" && echo ansible-tmp-1727204598.0902293-52552-125099236730888="` echo /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888 `" ) && sleep 0' 46400 1727204598.10081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.10085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.10114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.10117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.10120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204598.10122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.10185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.10757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204598.10776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.12699: stdout chunk (state=3): >>>ansible-tmp-1727204598.0902293-52552-125099236730888=/root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888 <<< 46400 1727204598.12809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204598.12922: stderr chunk (state=3): >>><<< 46400 1727204598.12926: stdout chunk (state=3): >>><<< 46400 1727204598.12929: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204598.0902293-52552-125099236730888=/root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204598.12948: variable 'ansible_module_compression' from source: unknown 46400 1727204598.13002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204598.13066: variable 'ansible_facts' from source: unknown 46400 1727204598.13252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/AnsiballZ_package_facts.py 46400 1727204598.14672: Sending initial data 46400 1727204598.14677: Sent initial data (162 bytes) 46400 1727204598.14995: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204598.14999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.15001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.15004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.15006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.15008: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204598.15010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.15012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204598.15014: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204598.15016: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204598.15018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.15020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.15022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.15024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.15026: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204598.15027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.15029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204598.15031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.15033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204598.15439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.16993: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204598.17018: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204598.17061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpvbsnyoxb /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/AnsiballZ_package_facts.py <<< 46400 1727204598.17098: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204598.19028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204598.19137: stderr chunk (state=3): >>><<< 46400 1727204598.19141: stdout chunk (state=3): >>><<< 46400 1727204598.19158: done transferring module to remote 46400 1727204598.19170: _low_level_execute_command(): starting 46400 1727204598.19177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/ /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/AnsiballZ_package_facts.py && sleep 0' 46400 1727204598.19771: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204598.19781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.19805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.19999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.20003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.20005: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204598.20008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.20010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204598.20013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204598.20015: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204598.20017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.20019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.20031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.20039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.20046: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204598.20055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.20131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204598.20148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.20160: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 46400 1727204598.20227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.21989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204598.22009: stderr chunk (state=3): >>><<< 46400 1727204598.22012: stdout chunk (state=3): >>><<< 46400 1727204598.22028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204598.22031: _low_level_execute_command(): starting 46400 1727204598.22036: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/AnsiballZ_package_facts.py && sleep 0' 46400 1727204598.22499: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.22505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.22542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.22547: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204598.22556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.22575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204598.22581: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204598.22588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.22594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204598.22603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.22610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204598.22615: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204598.22619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.22678: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 46400 1727204598.22699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.22702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204598.22758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.69705: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "rel<<< 46400 1727204598.69784: stdout chunk (state=3): >>>ease": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": 
"gdbm-libs", "version<<< 46400 1727204598.69795: stdout chunk (state=3): >>>": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": 
"gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch"<<< 46400 1727204598.69800: stdout chunk (state=3): >>>: "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux":<<< 46400 1727204598.69804: stdout chunk (state=3): >>> [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": 
"device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86<<< 46400 1727204598.69880: stdout chunk (state=3): >>>_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "sourc<<< 46400 1727204598.69890: stdout chunk (state=3): >>>e": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": 
"epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch":<<< 46400 1727204598.69894: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "n<<< 46400 1727204598.69930: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "p<<< 46400 1727204598.69934: stdout chunk (state=3): >>>erl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": 
"35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils",<<< 46400 1727204598.69952: stdout chunk (state=3): >>> "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204598.71455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204598.71512: stderr chunk (state=3): >>><<< 46400 1727204598.71516: stdout chunk (state=3): >>><<< 46400 1727204598.71554: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204598.73987: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204598.74006: _low_level_execute_command(): starting 46400 1727204598.74009: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204598.0902293-52552-125099236730888/ > /dev/null 2>&1 && sleep 0' 46400 1727204598.74469: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204598.74476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204598.74506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.74519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204598.74580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204598.74585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204598.74639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204598.76475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204598.76529: stderr chunk (state=3): >>><<< 46400 1727204598.76532: stdout chunk (state=3): >>><<< 46400 1727204598.76547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204598.76552: handler run complete 46400 1727204598.77076: variable 'ansible_facts' from source: unknown 46400 1727204598.77379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.78562: variable 'ansible_facts' from source: unknown 46400 1727204598.78833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.79470: attempt loop complete, returning result 46400 1727204598.79484: _execute() done 46400 1727204598.79487: dumping result to json 46400 1727204598.79695: done dumping result, returning 46400 1727204598.79701: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000001b98] 46400 1727204598.79705: sending task result for task 0affcd87-79f5-1303-fda8-000000001b98 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204598.81752: no more pending results, returning what we have 46400 1727204598.81754: results queue empty 46400 1727204598.81755: checking for any_errors_fatal 46400 1727204598.81758: done checking for any_errors_fatal 46400 1727204598.81761: checking for max_fail_percentage 46400 1727204598.81762: done checking for max_fail_percentage 46400 1727204598.81763: checking to see if all hosts have failed and the running result is not ok 46400 1727204598.81765: done checking to see if all hosts have failed 46400 1727204598.81766: getting the remaining hosts for this loop 46400 1727204598.81766: done getting the remaining hosts for this loop 46400 1727204598.81769: getting the next task for host managed-node2 46400 1727204598.81776: done getting next task for host managed-node2 46400 1727204598.81779: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204598.81783: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204598.81792: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b98 46400 1727204598.81795: WORKER PROCESS EXITING 46400 1727204598.81802: getting variables 46400 1727204598.81804: in VariableManager get_vars() 46400 1727204598.81831: Calling all_inventory to load vars for managed-node2 46400 1727204598.81833: Calling groups_inventory to load vars for managed-node2 46400 1727204598.81834: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204598.81841: Calling all_plugins_play to load vars for managed-node2 46400 1727204598.81843: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204598.81845: Calling groups_plugins_play to load vars for managed-node2 46400 1727204598.82594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.83603: done with get_vars() 46400 1727204598.83621: done getting variables 46400 1727204598.83674: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.809) 0:01:29.121 ***** 46400 1727204598.83706: entering _queue_task() for managed-node2/debug 46400 1727204598.83954: worker is 1 (out of 1 available) 46400 1727204598.83972: exiting _queue_task() for managed-node2/debug 46400 1727204598.83984: done queuing things up, now waiting for results queue to drain 46400 1727204598.83986: waiting for pending results... 
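
The "Print network provider" task queued above (roles/network/tasks/main.yml:7 per the task banner) is a plain debug of the network_provider fact; its output, "Using network provider: nm", appears a few lines further down. A minimal sketch of what such a task might look like; the exact wording inside the role may differ:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
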
46400 1727204598.84176: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204598.84275: in run() - task 0affcd87-79f5-1303-fda8-000000001b3c 46400 1727204598.84288: variable 'ansible_search_path' from source: unknown 46400 1727204598.84292: variable 'ansible_search_path' from source: unknown 46400 1727204598.84322: calling self._execute() 46400 1727204598.84401: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.84405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.84415: variable 'omit' from source: magic vars 46400 1727204598.84699: variable 'ansible_distribution_major_version' from source: facts 46400 1727204598.84709: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204598.84715: variable 'omit' from source: magic vars 46400 1727204598.84767: variable 'omit' from source: magic vars 46400 1727204598.84836: variable 'network_provider' from source: set_fact 46400 1727204598.84851: variable 'omit' from source: magic vars 46400 1727204598.84888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204598.84915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204598.84933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204598.84946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204598.84961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204598.84983: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204598.84986: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.84989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.85051: Set connection var ansible_shell_type to sh 46400 1727204598.85063: Set connection var ansible_shell_executable to /bin/sh 46400 1727204598.85066: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204598.85071: Set connection var ansible_connection to ssh 46400 1727204598.85075: Set connection var ansible_pipelining to False 46400 1727204598.85083: Set connection var ansible_timeout to 10 46400 1727204598.85101: variable 'ansible_shell_executable' from source: unknown 46400 1727204598.85104: variable 'ansible_connection' from source: unknown 46400 1727204598.85107: variable 'ansible_module_compression' from source: unknown 46400 1727204598.85109: variable 'ansible_shell_type' from source: unknown 46400 1727204598.85111: variable 'ansible_shell_executable' from source: unknown 46400 1727204598.85113: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.85116: variable 'ansible_pipelining' from source: unknown 46400 1727204598.85120: variable 'ansible_timeout' from source: unknown 46400 1727204598.85124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.85225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204598.85234: variable 'omit' from source: magic vars 46400 1727204598.85239: starting attempt loop 46400 1727204598.85242: running the handler 46400 1727204598.85278: handler run complete 46400 1727204598.85290: attempt loop complete, returning result 46400 1727204598.85293: _execute() done 46400 1727204598.85297: dumping result to json 46400 1727204598.85299: done dumping result, returning 46400 1727204598.85307: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000001b3c] 46400 1727204598.85313: sending task result for task 0affcd87-79f5-1303-fda8-000000001b3c 46400 1727204598.85400: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b3c 46400 1727204598.85403: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204598.85488: no more pending results, returning what we have 46400 1727204598.85492: results queue empty 46400 1727204598.85493: checking for any_errors_fatal 46400 1727204598.85505: done checking for any_errors_fatal 46400 1727204598.85506: checking for max_fail_percentage 46400 1727204598.85508: done checking for max_fail_percentage 46400 1727204598.85509: checking to see if all hosts have failed and the running result is not ok 46400 1727204598.85510: done checking to see if all hosts have failed 46400 1727204598.85510: getting the remaining hosts for this loop 46400 1727204598.85512: done getting the remaining hosts for this loop 46400 1727204598.85519: getting the next task for host managed-node2 46400 1727204598.85530: done getting next task for host managed-node2 46400 1727204598.85537: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204598.85543: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204598.85555: getting variables 46400 1727204598.85556: in VariableManager get_vars() 46400 1727204598.85594: Calling all_inventory to load vars for managed-node2 46400 1727204598.85597: Calling groups_inventory to load vars for managed-node2 46400 1727204598.85599: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204598.85607: Calling all_plugins_play to load vars for managed-node2 46400 1727204598.85614: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204598.85617: Calling groups_plugins_play to load vars for managed-node2 46400 1727204598.86418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.87413: done with get_vars() 46400 1727204598.87435: done getting variables 46400 1727204598.87493: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.038) 0:01:29.159 ***** 46400 1727204598.87536: entering _queue_task() for managed-node2/fail 46400 1727204598.87846: worker is 1 (out of 1 available) 46400 1727204598.87863: exiting _queue_task() for managed-node2/fail 46400 1727204598.87878: done queuing things up, now waiting for results queue to drain 46400 1727204598.87880: waiting for pending results... 
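
The task queued above is a guard that aborts the play when the network_state variable is combined with the initscripts provider; the trace that follows shows its when-condition (network_state != {}) evaluating to False, so it is skipped. A hedged sketch of such a guard, showing only the condition visible in this log (the real task in the role likely carries an additional provider check and a different message):

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider  # illustrative message
  when: network_state != {}  # reported below as the false_condition for this skip
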
46400 1727204598.88185: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204598.88338: in run() - task 0affcd87-79f5-1303-fda8-000000001b3d 46400 1727204598.88366: variable 'ansible_search_path' from source: unknown 46400 1727204598.88376: variable 'ansible_search_path' from source: unknown 46400 1727204598.88427: calling self._execute() 46400 1727204598.88534: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.88550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.88570: variable 'omit' from source: magic vars 46400 1727204598.88958: variable 'ansible_distribution_major_version' from source: facts 46400 1727204598.88981: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204598.89074: variable 'network_state' from source: role '' defaults 46400 1727204598.89082: Evaluated conditional (network_state != {}): False 46400 1727204598.89087: when evaluation is False, skipping this task 46400 1727204598.89090: _execute() done 46400 1727204598.89092: dumping result to json 46400 1727204598.89094: done dumping result, returning 46400 1727204598.89100: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000001b3d] 46400 1727204598.89106: sending task result for task 0affcd87-79f5-1303-fda8-000000001b3d 46400 1727204598.89201: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b3d 46400 1727204598.89205: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204598.89250: no more pending results, returning what we have 46400 1727204598.89254: results queue empty 46400 1727204598.89255: checking for any_errors_fatal 46400 1727204598.89266: done checking for any_errors_fatal 46400 1727204598.89267: checking for max_fail_percentage 46400 1727204598.89268: done checking for max_fail_percentage 46400 1727204598.89269: checking to see if all hosts have failed and the running result is not ok 46400 1727204598.89270: done checking to see if all hosts have failed 46400 1727204598.89271: getting the remaining hosts for this loop 46400 1727204598.89273: done getting the remaining hosts for this loop 46400 1727204598.89277: getting the next task for host managed-node2 46400 1727204598.89285: done getting next task for host managed-node2 46400 1727204598.89290: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204598.89295: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204598.89316: getting variables 46400 1727204598.89318: in VariableManager get_vars() 46400 1727204598.89352: Calling all_inventory to load vars for managed-node2 46400 1727204598.89355: Calling groups_inventory to load vars for managed-node2 46400 1727204598.89356: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204598.89369: Calling all_plugins_play to load vars for managed-node2 46400 1727204598.89371: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204598.89373: Calling groups_plugins_play to load vars for managed-node2 46400 1727204598.90349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.91755: done with get_vars() 46400 1727204598.91781: done getting variables 46400 1727204598.91842: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.043) 0:01:29.203 ***** 46400 1727204598.91881: entering _queue_task() for managed-node2/fail 46400 1727204598.92189: worker is 1 (out of 1 available) 46400 1727204598.92201: exiting _queue_task() for managed-node2/fail 46400 1727204598.92213: done queuing things up, now waiting for results queue to drain 46400 1727204598.92215: waiting for pending results... 
46400 1727204598.92531: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204598.92705: in run() - task 0affcd87-79f5-1303-fda8-000000001b3e 46400 1727204598.92727: variable 'ansible_search_path' from source: unknown 46400 1727204598.92734: variable 'ansible_search_path' from source: unknown 46400 1727204598.92782: calling self._execute() 46400 1727204598.92900: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204598.92911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204598.92925: variable 'omit' from source: magic vars 46400 1727204598.93337: variable 'ansible_distribution_major_version' from source: facts 46400 1727204598.93353: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204598.93477: variable 'network_state' from source: role '' defaults 46400 1727204598.93491: Evaluated conditional (network_state != {}): False 46400 1727204598.93498: when evaluation is False, skipping this task 46400 1727204598.93504: _execute() done 46400 1727204598.93510: dumping result to json 46400 1727204598.93517: done dumping result, returning 46400 1727204598.93531: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000001b3e] 46400 1727204598.93542: sending task result for task 0affcd87-79f5-1303-fda8-000000001b3e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204598.93687: no more pending results, returning what we have 46400 1727204598.93691: results queue empty 46400 1727204598.93692: checking for any_errors_fatal 46400 1727204598.93703: done checking for any_errors_fatal 46400 1727204598.93704: checking for max_fail_percentage 46400 1727204598.93706: done checking for max_fail_percentage 46400 1727204598.93707: checking to see if all hosts have failed and the running result is not ok 46400 1727204598.93708: done checking to see if all hosts have failed 46400 1727204598.93709: getting the remaining hosts for this loop 46400 1727204598.93711: done getting the remaining hosts for this loop 46400 1727204598.93715: getting the next task for host managed-node2 46400 1727204598.93725: done getting next task for host managed-node2 46400 1727204598.93729: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204598.93735: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204598.93770: getting variables 46400 1727204598.93772: in VariableManager get_vars() 46400 1727204598.93816: Calling all_inventory to load vars for managed-node2 46400 1727204598.93819: Calling groups_inventory to load vars for managed-node2 46400 1727204598.93821: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204598.93834: Calling all_plugins_play to load vars for managed-node2 46400 1727204598.93836: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204598.93839: Calling groups_plugins_play to load vars for managed-node2 46400 1727204598.94783: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b3e 46400 1727204598.94786: WORKER PROCESS EXITING 46400 1727204598.95584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204598.98122: done with get_vars() 46400 1727204598.98161: done getting variables 46400 1727204598.98227: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:18 -0400 (0:00:00.063) 0:01:29.267 ***** 46400 1727204598.98271: entering _queue_task() for managed-node2/fail 46400 1727204598.98618: worker is 1 (out of 1 available) 46400 1727204598.98631: exiting _queue_task() for managed-node2/fail 46400 1727204598.98643: done queuing things up, now waiting for results queue to drain 46400 1727204598.98644: waiting for pending results... 
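
The task queued above guards against applying teaming configuration on EL10 or later; the trace below shows its conditional (ansible_distribution_major_version | int > 9) evaluating to False on this EL9 host, so it is skipped. A minimal sketch using only the conditional visible in this trace (the real task may also check whether any team interfaces are actually defined):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # illustrative message
  when: ansible_distribution_major_version | int > 9  # reported below as the false_condition for this skip
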
46400 1727204599.00261: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204599.00889: in run() - task 0affcd87-79f5-1303-fda8-000000001b3f 46400 1727204599.00987: variable 'ansible_search_path' from source: unknown 46400 1727204599.00991: variable 'ansible_search_path' from source: unknown 46400 1727204599.01113: calling self._execute() 46400 1727204599.01433: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.01437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.01672: variable 'omit' from source: magic vars 46400 1727204599.02642: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.02661: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.02874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204599.07120: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204599.07197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204599.07243: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204599.07287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204599.07319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204599.07408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.07448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.07485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.07531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.07556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.07675: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.07695: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204599.07702: when evaluation is False, skipping this task 46400 1727204599.07708: _execute() done 46400 1727204599.07715: dumping result to json 46400 1727204599.07721: done dumping result, returning 46400 1727204599.07734: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000001b3f] 46400 1727204599.07746: sending task result for task 
0affcd87-79f5-1303-fda8-000000001b3f skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204599.07914: no more pending results, returning what we have 46400 1727204599.07919: results queue empty 46400 1727204599.07920: checking for any_errors_fatal 46400 1727204599.07925: done checking for any_errors_fatal 46400 1727204599.07926: checking for max_fail_percentage 46400 1727204599.07928: done checking for max_fail_percentage 46400 1727204599.07929: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.07930: done checking to see if all hosts have failed 46400 1727204599.07931: getting the remaining hosts for this loop 46400 1727204599.07933: done getting the remaining hosts for this loop 46400 1727204599.07937: getting the next task for host managed-node2 46400 1727204599.07948: done getting next task for host managed-node2 46400 1727204599.07953: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204599.07961: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204599.07993: getting variables 46400 1727204599.07995: in VariableManager get_vars() 46400 1727204599.08043: Calling all_inventory to load vars for managed-node2 46400 1727204599.08047: Calling groups_inventory to load vars for managed-node2 46400 1727204599.08049: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.08065: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.08069: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.08072: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.09081: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b3f 46400 1727204599.09084: WORKER PROCESS EXITING 46400 1727204599.10035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.11765: done with get_vars() 46400 1727204599.11792: done getting variables 46400 1727204599.11853: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.136) 0:01:29.403 ***** 46400 1727204599.11893: entering _queue_task() for managed-node2/dnf 46400 1727204599.13449: worker is 1 (out of 1 available) 46400 1727204599.13466: exiting _queue_task() for managed-node2/dnf 46400 1727204599.13479: done queuing things up, now waiting for results queue to drain 46400 1727204599.13481: waiting for pending results... 
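
The task queued above (main.yml:36) uses the dnf action plugin to check whether updated network packages are available when wireless or team interfaces are configured; the trace below shows its distribution conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) evaluating to True, after which the network_connections and interface variables are resolved. A rough, hedged sketch of how such a check could be expressed; the package name and check_mode usage are assumptions made here, and only the distribution conditional is taken from the trace:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager-wifi  # illustrative; the real task derives its package list from network_connections
    state: latest
  check_mode: true
  when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
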
46400 1727204599.14103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204599.14525: in run() - task 0affcd87-79f5-1303-fda8-000000001b40 46400 1727204599.14548: variable 'ansible_search_path' from source: unknown 46400 1727204599.14556: variable 'ansible_search_path' from source: unknown 46400 1727204599.14717: calling self._execute() 46400 1727204599.14824: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.14908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.14925: variable 'omit' from source: magic vars 46400 1727204599.15498: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.15516: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.15745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204599.19728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204599.19811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204599.19863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204599.19906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204599.19945: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204599.20150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.20207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.20363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.20408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.20482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.20730: variable 'ansible_distribution' from source: facts 46400 1727204599.20797: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.20818: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204599.21175: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204599.21591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.21691: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.21721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.21932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.21986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.22153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.22186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.22216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.22263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.22349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.22419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.22585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.22617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.22702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.22795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.23148: variable 'network_connections' from source: include params 46400 1727204599.23171: variable 'interface' from source: play vars 46400 1727204599.23261: variable 'interface' from source: play vars 46400 1727204599.23343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204599.23540: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204599.23590: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204599.23627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204599.23672: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204599.23719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204599.23745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204599.23792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.23825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204599.23887: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204599.24174: variable 'network_connections' from source: include params 46400 1727204599.24184: variable 'interface' from source: play vars 46400 1727204599.24299: variable 'interface' from source: play vars 46400 1727204599.24327: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204599.24335: when evaluation is False, skipping this task 46400 1727204599.24345: _execute() done 46400 1727204599.24351: dumping result to json 46400 1727204599.24358: done dumping result, returning 46400 1727204599.24373: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001b40] 46400 1727204599.24383: sending task result for task 0affcd87-79f5-1303-fda8-000000001b40 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204599.24537: no more pending results, returning what we have 46400 1727204599.24542: results queue empty 46400 1727204599.24543: checking for any_errors_fatal 46400 1727204599.24551: done checking for any_errors_fatal 46400 1727204599.24552: checking for max_fail_percentage 46400 1727204599.24554: done checking for max_fail_percentage 46400 1727204599.24555: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.24556: done checking to see if all hosts have failed 46400 1727204599.24556: getting the remaining hosts for this loop 46400 1727204599.24558: done getting the remaining hosts for this loop 46400 1727204599.24567: getting the next task for host managed-node2 46400 1727204599.24577: done getting next task for host managed-node2 46400 1727204599.24582: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204599.24587: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204599.24616: getting variables 46400 1727204599.24618: in VariableManager get_vars() 46400 1727204599.24668: Calling all_inventory to load vars for managed-node2 46400 1727204599.24671: Calling groups_inventory to load vars for managed-node2 46400 1727204599.24673: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.24685: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.24688: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.24691: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.27483: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b40 46400 1727204599.27487: WORKER PROCESS EXITING 46400 1727204599.28583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.32647: done with get_vars() 46400 1727204599.32679: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204599.33370: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.215) 0:01:29.618 ***** 46400 1727204599.33408: entering _queue_task() for managed-node2/yum 46400 1727204599.33761: worker is 1 (out of 1 available) 46400 1727204599.34514: exiting _queue_task() for managed-node2/yum 46400 1727204599.34526: done queuing things up, now waiting for results queue to drain 46400 1727204599.34527: waiting for pending results... 
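Annotation: the companion task at roles/network/tasks/main.yml:48 covers the legacy YUM path; the trace shows ansible.builtin.yum being redirected to ansible.builtin.dnf and the task skipping because ansible_distribution_major_version | int < 8 is False on this host. A rough sketch, with the module arguments assumed by analogy with the DNF task above:

    # Hypothetical sketch of the task at roles/network/tasks/main.yml:48
    - name: >-
        Check if updates for network packages are available through the
        YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed argument
        state: latest
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined  # assumed by analogy; never reached in this run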
46400 1727204599.34547: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204599.35134: in run() - task 0affcd87-79f5-1303-fda8-000000001b41 46400 1727204599.35220: variable 'ansible_search_path' from source: unknown 46400 1727204599.35229: variable 'ansible_search_path' from source: unknown 46400 1727204599.35278: calling self._execute() 46400 1727204599.35579: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.35591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.35605: variable 'omit' from source: magic vars 46400 1727204599.36455: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.36476: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.36878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204599.40843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204599.40977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204599.41026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204599.41071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204599.41109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204599.41198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.41250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.41286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.41335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.41357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.41602: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.41621: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204599.41628: when evaluation is False, skipping this task 46400 1727204599.41635: _execute() done 46400 1727204599.41643: dumping result to json 46400 1727204599.41650: done dumping result, returning 46400 1727204599.41665: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001b41] 46400 
1727204599.41678: sending task result for task 0affcd87-79f5-1303-fda8-000000001b41 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204599.41854: no more pending results, returning what we have 46400 1727204599.41862: results queue empty 46400 1727204599.41863: checking for any_errors_fatal 46400 1727204599.41874: done checking for any_errors_fatal 46400 1727204599.41875: checking for max_fail_percentage 46400 1727204599.41877: done checking for max_fail_percentage 46400 1727204599.41878: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.41879: done checking to see if all hosts have failed 46400 1727204599.41879: getting the remaining hosts for this loop 46400 1727204599.41881: done getting the remaining hosts for this loop 46400 1727204599.41885: getting the next task for host managed-node2 46400 1727204599.41897: done getting next task for host managed-node2 46400 1727204599.41902: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204599.41907: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204599.41937: getting variables 46400 1727204599.41939: in VariableManager get_vars() 46400 1727204599.41990: Calling all_inventory to load vars for managed-node2 46400 1727204599.41993: Calling groups_inventory to load vars for managed-node2 46400 1727204599.41996: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.42008: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.42011: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.42014: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.43372: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b41 46400 1727204599.43376: WORKER PROCESS EXITING 46400 1727204599.44349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.46635: done with get_vars() 46400 1727204599.46673: done getting variables 46400 1727204599.46739: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.133) 0:01:29.752 ***** 46400 1727204599.46785: entering _queue_task() for managed-node2/fail 46400 1727204599.47227: worker is 1 (out of 1 available) 46400 1727204599.47241: exiting _queue_task() for managed-node2/fail 46400 1727204599.47254: done queuing things up, now waiting for results queue to drain 46400 1727204599.47256: waiting for pending results... 
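Annotation: the "Ask user's consent" task queued above loads the fail action module, so despite the name it is a guard that aborts the play rather than an interactive prompt. Only the wireless/team conditional is visible in this trace; the message wording and the acknowledgement check mentioned in the comment are purely illustrative assumptions.

    # Hypothetical sketch of the task at roles/network/tasks/main.yml:60
    - name: >-
        Ask user's consent to restart NetworkManager due to wireless
        or team interfaces
      ansible.builtin.fail:
        msg: >-
          Illustrative wording only: restarting NetworkManager is required to
          activate wireless or team connections on this host.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        # the role presumably also checks an acknowledgement variable here,
        # but no such variable is visible in this trace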
46400 1727204599.48840: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204599.49166: in run() - task 0affcd87-79f5-1303-fda8-000000001b42 46400 1727204599.49454: variable 'ansible_search_path' from source: unknown 46400 1727204599.49461: variable 'ansible_search_path' from source: unknown 46400 1727204599.49469: calling self._execute() 46400 1727204599.49794: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.49798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.49802: variable 'omit' from source: magic vars 46400 1727204599.51003: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.51100: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.51245: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204599.51462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204599.54511: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204599.54596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204599.54642: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204599.54697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204599.54734: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204599.54828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.54884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.54922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.54974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.54995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.55050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.55085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.55116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.55211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.55363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.55497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.55591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.55622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.55710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.55762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.56003: variable 'network_connections' from source: include params 46400 1727204599.56023: variable 'interface' from source: play vars 46400 1727204599.56112: variable 'interface' from source: play vars 46400 1727204599.56266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204599.56505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204599.56548: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204599.56593: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204599.56628: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204599.56681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204599.56713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204599.56744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.56781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204599.56840: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204599.57120: variable 'network_connections' 
from source: include params 46400 1727204599.57133: variable 'interface' from source: play vars 46400 1727204599.57207: variable 'interface' from source: play vars 46400 1727204599.57243: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204599.57252: when evaluation is False, skipping this task 46400 1727204599.57258: _execute() done 46400 1727204599.57270: dumping result to json 46400 1727204599.57278: done dumping result, returning 46400 1727204599.57291: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001b42] 46400 1727204599.57301: sending task result for task 0affcd87-79f5-1303-fda8-000000001b42 46400 1727204599.57439: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b42 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204599.57507: no more pending results, returning what we have 46400 1727204599.57511: results queue empty 46400 1727204599.57512: checking for any_errors_fatal 46400 1727204599.57519: done checking for any_errors_fatal 46400 1727204599.57519: checking for max_fail_percentage 46400 1727204599.57521: done checking for max_fail_percentage 46400 1727204599.57522: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.57523: done checking to see if all hosts have failed 46400 1727204599.57524: getting the remaining hosts for this loop 46400 1727204599.57526: done getting the remaining hosts for this loop 46400 1727204599.57530: getting the next task for host managed-node2 46400 1727204599.57541: done getting next task for host managed-node2 46400 1727204599.57546: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204599.57551: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204599.57585: getting variables 46400 1727204599.57587: in VariableManager get_vars() 46400 1727204599.57633: Calling all_inventory to load vars for managed-node2 46400 1727204599.57636: Calling groups_inventory to load vars for managed-node2 46400 1727204599.57640: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.57652: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.57654: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.57658: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.59010: WORKER PROCESS EXITING 46400 1727204599.60339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.62686: done with get_vars() 46400 1727204599.62724: done getting variables 46400 1727204599.62794: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.160) 0:01:29.912 ***** 46400 1727204599.62833: entering _queue_task() for managed-node2/package 46400 1727204599.63194: worker is 1 (out of 1 available) 46400 1727204599.63206: exiting _queue_task() for managed-node2/package 46400 1727204599.63219: done queuing things up, now waiting for results queue to drain 46400 1727204599.63220: waiting for pending results... 
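Annotation: the "Install packages" task is the first one in this stretch whose inputs are fully visible in the trace below: it uses the generic package action module, consumes the resolved network_packages list, and is skipped because every listed package already appears in ansible_facts.packages. A sketch consistent with the conditional reported in the skip result:

    # Hypothetical sketch of the task at roles/network/tasks/main.yml:73
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when:
        - not network_packages is subset(ansible_facts.packages.keys())

Guarding on the subset test means the package manager is never invoked when nothing would change, which is why this run reports a skip rather than an "ok" from the package module.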
46400 1727204599.63697: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204599.63841: in run() - task 0affcd87-79f5-1303-fda8-000000001b43 46400 1727204599.63868: variable 'ansible_search_path' from source: unknown 46400 1727204599.63880: variable 'ansible_search_path' from source: unknown 46400 1727204599.63922: calling self._execute() 46400 1727204599.64033: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.64045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.64057: variable 'omit' from source: magic vars 46400 1727204599.64475: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.64493: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.64926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204599.65250: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204599.65314: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204599.65351: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204599.65438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204599.65575: variable 'network_packages' from source: role '' defaults 46400 1727204599.65696: variable '__network_provider_setup' from source: role '' defaults 46400 1727204599.65711: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204599.65785: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204599.65800: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204599.65868: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204599.66147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204599.69245: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204599.69325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204599.69378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204599.69415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204599.69449: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204599.69536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.69701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.69735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.69866: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.69892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.69942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.69977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.70012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.70058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.70084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.70354: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204599.70488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.70516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.70551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.70601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.70621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.70728: variable 'ansible_python' from source: facts 46400 1727204599.70753: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204599.70847: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204599.70941: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204599.71084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.71116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204599.71145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.71194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.71217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.71314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204599.71353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204599.71388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.71510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204599.71602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204599.71785: variable 'network_connections' from source: include params 46400 1727204599.71797: variable 'interface' from source: play vars 46400 1727204599.71908: variable 'interface' from source: play vars 46400 1727204599.72012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204599.72101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204599.72239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204599.72281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204599.72336: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204599.72692: variable 'network_connections' from source: include params 46400 1727204599.72703: variable 'interface' from source: play vars 46400 1727204599.72833: variable 'interface' from source: play vars 46400 1727204599.72997: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204599.73134: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204599.73484: variable 'network_connections' from source: include params 46400 
1727204599.73494: variable 'interface' from source: play vars 46400 1727204599.73570: variable 'interface' from source: play vars 46400 1727204599.73597: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204599.73686: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204599.74035: variable 'network_connections' from source: include params 46400 1727204599.74051: variable 'interface' from source: play vars 46400 1727204599.74193: variable 'interface' from source: play vars 46400 1727204599.74249: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204599.74343: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204599.74356: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204599.74424: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204599.74652: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204599.75395: variable 'network_connections' from source: include params 46400 1727204599.75405: variable 'interface' from source: play vars 46400 1727204599.75591: variable 'interface' from source: play vars 46400 1727204599.75643: variable 'ansible_distribution' from source: facts 46400 1727204599.75651: variable '__network_rh_distros' from source: role '' defaults 46400 1727204599.75661: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.75683: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204599.76026: variable 'ansible_distribution' from source: facts 46400 1727204599.76077: variable '__network_rh_distros' from source: role '' defaults 46400 1727204599.76087: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.76104: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204599.76420: variable 'ansible_distribution' from source: facts 46400 1727204599.76514: variable '__network_rh_distros' from source: role '' defaults 46400 1727204599.76524: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.76570: variable 'network_provider' from source: set_fact 46400 1727204599.76590: variable 'ansible_facts' from source: unknown 46400 1727204599.77621: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204599.77629: when evaluation is False, skipping this task 46400 1727204599.77635: _execute() done 46400 1727204599.77642: dumping result to json 46400 1727204599.77648: done dumping result, returning 46400 1727204599.77662: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000001b43] 46400 1727204599.77679: sending task result for task 0affcd87-79f5-1303-fda8-000000001b43 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204599.77879: no more pending results, returning what we have 46400 1727204599.77884: results queue empty 46400 1727204599.77885: checking for any_errors_fatal 46400 1727204599.77894: done checking for any_errors_fatal 46400 1727204599.77895: checking for max_fail_percentage 46400 1727204599.77897: done checking for max_fail_percentage 46400 
1727204599.77898: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.77899: done checking to see if all hosts have failed 46400 1727204599.77899: getting the remaining hosts for this loop 46400 1727204599.77901: done getting the remaining hosts for this loop 46400 1727204599.77905: getting the next task for host managed-node2 46400 1727204599.77914: done getting next task for host managed-node2 46400 1727204599.77919: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204599.77925: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204599.77975: getting variables 46400 1727204599.77981: in VariableManager get_vars() 46400 1727204599.78046: Calling all_inventory to load vars for managed-node2 46400 1727204599.78049: Calling groups_inventory to load vars for managed-node2 46400 1727204599.78060: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.78073: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.78076: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.78079: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.79121: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b43 46400 1727204599.79125: WORKER PROCESS EXITING 46400 1727204599.81391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.83886: done with get_vars() 46400 1727204599.83925: done getting variables 46400 1727204599.84013: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.212) 0:01:30.125 ***** 46400 1727204599.84149: entering _queue_task() for managed-node2/package 46400 1727204599.84614: worker is 1 (out of 1 available) 46400 1727204599.84627: exiting _queue_task() for managed-node2/package 46400 
1727204599.84640: done queuing things up, now waiting for results queue to drain 46400 1727204599.84642: waiting for pending results... 46400 1727204599.84951: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204599.85130: in run() - task 0affcd87-79f5-1303-fda8-000000001b44 46400 1727204599.85155: variable 'ansible_search_path' from source: unknown 46400 1727204599.85168: variable 'ansible_search_path' from source: unknown 46400 1727204599.85211: calling self._execute() 46400 1727204599.85337: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.85351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.85374: variable 'omit' from source: magic vars 46400 1727204599.85833: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.85854: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.86012: variable 'network_state' from source: role '' defaults 46400 1727204599.86058: Evaluated conditional (network_state != {}): False 46400 1727204599.86083: when evaluation is False, skipping this task 46400 1727204599.86091: _execute() done 46400 1727204599.86111: dumping result to json 46400 1727204599.86118: done dumping result, returning 46400 1727204599.86128: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001b44] 46400 1727204599.86147: sending task result for task 0affcd87-79f5-1303-fda8-000000001b44 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204599.86986: no more pending results, returning what we have 46400 1727204599.86991: results queue empty 46400 1727204599.86992: checking for any_errors_fatal 46400 1727204599.87000: done checking for any_errors_fatal 46400 1727204599.87001: checking for max_fail_percentage 46400 1727204599.87003: done checking for max_fail_percentage 46400 1727204599.87004: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.87004: done checking to see if all hosts have failed 46400 1727204599.87005: getting the remaining hosts for this loop 46400 1727204599.87007: done getting the remaining hosts for this loop 46400 1727204599.87011: getting the next task for host managed-node2 46400 1727204599.87021: done getting next task for host managed-node2 46400 1727204599.87026: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204599.87032: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204599.87069: getting variables 46400 1727204599.87071: in VariableManager get_vars() 46400 1727204599.87118: Calling all_inventory to load vars for managed-node2 46400 1727204599.87120: Calling groups_inventory to load vars for managed-node2 46400 1727204599.87123: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.87137: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.87139: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.87142: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.89348: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b44 46400 1727204599.89352: WORKER PROCESS EXITING 46400 1727204599.91756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.93486: done with get_vars() 46400 1727204599.93515: done getting variables 46400 1727204599.93561: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.094) 0:01:30.220 ***** 46400 1727204599.93593: entering _queue_task() for managed-node2/package 46400 1727204599.93847: worker is 1 (out of 1 available) 46400 1727204599.93862: exiting _queue_task() for managed-node2/package 46400 1727204599.93876: done queuing things up, now waiting for results queue to drain 46400 1727204599.93878: waiting for pending results... 
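Annotation: this task and its sibling at roles/network/tasks/main.yml:85 ("Install NetworkManager and nmstate when using network_state variable") share the same guard, and both are skipped in this run because network_state is empty. A sketch of the main.yml:96 task, with the state argument assumed:

    # Hypothetical sketch of the task at roles/network/tasks/main.yml:96
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present   # assumed
      when:
        - network_state != {}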
46400 1727204599.94110: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204599.94289: in run() - task 0affcd87-79f5-1303-fda8-000000001b45 46400 1727204599.94308: variable 'ansible_search_path' from source: unknown 46400 1727204599.94314: variable 'ansible_search_path' from source: unknown 46400 1727204599.94363: calling self._execute() 46400 1727204599.94475: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.94493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.94506: variable 'omit' from source: magic vars 46400 1727204599.94910: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.94926: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.95058: variable 'network_state' from source: role '' defaults 46400 1727204599.95079: Evaluated conditional (network_state != {}): False 46400 1727204599.95086: when evaluation is False, skipping this task 46400 1727204599.95096: _execute() done 46400 1727204599.95104: dumping result to json 46400 1727204599.95111: done dumping result, returning 46400 1727204599.95123: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001b45] 46400 1727204599.95133: sending task result for task 0affcd87-79f5-1303-fda8-000000001b45 46400 1727204599.95245: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b45 46400 1727204599.95253: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204599.95609: no more pending results, returning what we have 46400 1727204599.95613: results queue empty 46400 1727204599.95614: checking for any_errors_fatal 46400 1727204599.95620: done checking for any_errors_fatal 46400 1727204599.95621: checking for max_fail_percentage 46400 1727204599.95623: done checking for max_fail_percentage 46400 1727204599.95623: checking to see if all hosts have failed and the running result is not ok 46400 1727204599.95624: done checking to see if all hosts have failed 46400 1727204599.95625: getting the remaining hosts for this loop 46400 1727204599.95626: done getting the remaining hosts for this loop 46400 1727204599.95630: getting the next task for host managed-node2 46400 1727204599.95638: done getting next task for host managed-node2 46400 1727204599.95642: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204599.95647: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204599.95671: getting variables 46400 1727204599.95673: in VariableManager get_vars() 46400 1727204599.95716: Calling all_inventory to load vars for managed-node2 46400 1727204599.95719: Calling groups_inventory to load vars for managed-node2 46400 1727204599.95724: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204599.95733: Calling all_plugins_play to load vars for managed-node2 46400 1727204599.95738: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204599.95741: Calling groups_plugins_play to load vars for managed-node2 46400 1727204599.97097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204599.98426: done with get_vars() 46400 1727204599.98449: done getting variables 46400 1727204599.98502: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.049) 0:01:30.269 ***** 46400 1727204599.98530: entering _queue_task() for managed-node2/service 46400 1727204599.98784: worker is 1 (out of 1 available) 46400 1727204599.98799: exiting _queue_task() for managed-node2/service 46400 1727204599.98811: done queuing things up, now waiting for results queue to drain 46400 1727204599.98812: waiting for pending results... 
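[Editor's note] The restart task queued above fires only when __network_wireless_connections_defined or __network_team_connections_defined is true; the role derives those flags from the profiles in network_connections, and the single "interface" profile supplied by this play defines neither a wireless nor a team connection, so the restart is skipped. A hypothetical network_connections value that would flip the wireless flag (illustrative only, not part of this test run):

    network_connections:
      - name: example-wifi          # hypothetical profile, not from this play
        type: wireless
        interface_name: wlan0
        wireless:
          ssid: example-ssid
          key_mgmt: wpa-psk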
46400 1727204599.99013: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204599.99112: in run() - task 0affcd87-79f5-1303-fda8-000000001b46 46400 1727204599.99123: variable 'ansible_search_path' from source: unknown 46400 1727204599.99127: variable 'ansible_search_path' from source: unknown 46400 1727204599.99161: calling self._execute() 46400 1727204599.99245: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204599.99256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204599.99267: variable 'omit' from source: magic vars 46400 1727204599.99551: variable 'ansible_distribution_major_version' from source: facts 46400 1727204599.99561: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204599.99874: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204599.99879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204600.02190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204600.02234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204600.02260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204600.02290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204600.02312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204600.02374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.02397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.02417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.02444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.02455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.02493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.02513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.02528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204600.02554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.02568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.02597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.02617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.02634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.02658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.02672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.02789: variable 'network_connections' from source: include params 46400 1727204600.02799: variable 'interface' from source: play vars 46400 1727204600.02852: variable 'interface' from source: play vars 46400 1727204600.02904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204600.03029: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204600.03058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204600.03085: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204600.03106: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204600.03137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204600.03156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204600.03179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.03197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204600.03235: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204600.03398: variable 'network_connections' from source: include params 46400 1727204600.03401: variable 'interface' 
from source: play vars 46400 1727204600.03445: variable 'interface' from source: play vars 46400 1727204600.03470: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204600.03474: when evaluation is False, skipping this task 46400 1727204600.03477: _execute() done 46400 1727204600.03480: dumping result to json 46400 1727204600.03482: done dumping result, returning 46400 1727204600.03485: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001b46] 46400 1727204600.03492: sending task result for task 0affcd87-79f5-1303-fda8-000000001b46 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204600.03636: no more pending results, returning what we have 46400 1727204600.03641: results queue empty 46400 1727204600.03642: checking for any_errors_fatal 46400 1727204600.03649: done checking for any_errors_fatal 46400 1727204600.03650: checking for max_fail_percentage 46400 1727204600.03651: done checking for max_fail_percentage 46400 1727204600.03652: checking to see if all hosts have failed and the running result is not ok 46400 1727204600.03653: done checking to see if all hosts have failed 46400 1727204600.03654: getting the remaining hosts for this loop 46400 1727204600.03655: done getting the remaining hosts for this loop 46400 1727204600.03659: getting the next task for host managed-node2 46400 1727204600.03670: done getting next task for host managed-node2 46400 1727204600.03674: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204600.03679: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204600.03705: getting variables 46400 1727204600.03707: in VariableManager get_vars() 46400 1727204600.03749: Calling all_inventory to load vars for managed-node2 46400 1727204600.03751: Calling groups_inventory to load vars for managed-node2 46400 1727204600.03754: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204600.03765: Calling all_plugins_play to load vars for managed-node2 46400 1727204600.03767: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204600.03770: Calling groups_plugins_play to load vars for managed-node2 46400 1727204600.04729: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b46 46400 1727204600.04733: WORKER PROCESS EXITING 46400 1727204600.04747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204600.05685: done with get_vars() 46400 1727204600.05703: done getting variables 46400 1727204600.05748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.072) 0:01:30.342 ***** 46400 1727204600.05775: entering _queue_task() for managed-node2/service 46400 1727204600.06017: worker is 1 (out of 1 available) 46400 1727204600.06033: exiting _queue_task() for managed-node2/service 46400 1727204600.06046: done queuing things up, now waiting for results queue to drain 46400 1727204600.06048: waiting for pending results... 
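[Editor's note] Unlike the preceding tasks, this one actually runs: network_provider is "nm", so network_provider == "nm" or network_state != {} evaluates True. The service action resolves to the systemd module, which is packaged as AnsiballZ_systemd.py, copied to the target over the multiplexed SSH connection, and executed there; the module arguments appear in the result below (name=NetworkManager, state=started, enabled=true). A standalone sketch of the effective operation, assuming the role's network_service_name default of NetworkManager:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}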
46400 1727204600.06258: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204600.06358: in run() - task 0affcd87-79f5-1303-fda8-000000001b47 46400 1727204600.06380: variable 'ansible_search_path' from source: unknown 46400 1727204600.06384: variable 'ansible_search_path' from source: unknown 46400 1727204600.06412: calling self._execute() 46400 1727204600.06493: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204600.06497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204600.06510: variable 'omit' from source: magic vars 46400 1727204600.06786: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.06795: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204600.06912: variable 'network_provider' from source: set_fact 46400 1727204600.06916: variable 'network_state' from source: role '' defaults 46400 1727204600.06925: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204600.06931: variable 'omit' from source: magic vars 46400 1727204600.06975: variable 'omit' from source: magic vars 46400 1727204600.06994: variable 'network_service_name' from source: role '' defaults 46400 1727204600.07077: variable 'network_service_name' from source: role '' defaults 46400 1727204600.07215: variable '__network_provider_setup' from source: role '' defaults 46400 1727204600.07246: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204600.07321: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204600.07334: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204600.07413: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204600.07668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204600.09577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204600.09623: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204600.09651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204600.09691: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204600.09712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204600.09773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.09793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.09813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.09839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204600.09851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.09888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.09904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.09923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.09948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.09990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.10473: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204600.10478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.10481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.10483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.10486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.10488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.10490: variable 'ansible_python' from source: facts 46400 1727204600.10492: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204600.10773: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204600.10776: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204600.10778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.10798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.10818: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.10855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.10871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.10916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.10940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.10963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.11000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.11013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.11145: variable 'network_connections' from source: include params 46400 1727204600.11151: variable 'interface' from source: play vars 46400 1727204600.11225: variable 'interface' from source: play vars 46400 1727204600.11329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204600.11512: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204600.11559: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204600.11605: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204600.11643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204600.11718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204600.11748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204600.11784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.11817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204600.11865: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204600.12143: variable 'network_connections' from source: include params 46400 1727204600.12149: variable 'interface' from source: play vars 46400 1727204600.12227: variable 'interface' from source: play vars 46400 1727204600.12256: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204600.12338: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204600.12631: variable 'network_connections' from source: include params 46400 1727204600.12634: variable 'interface' from source: play vars 46400 1727204600.12707: variable 'interface' from source: play vars 46400 1727204600.12728: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204600.12807: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204600.13098: variable 'network_connections' from source: include params 46400 1727204600.13101: variable 'interface' from source: play vars 46400 1727204600.13172: variable 'interface' from source: play vars 46400 1727204600.13223: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204600.13286: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204600.13293: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204600.13352: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204600.13574: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204600.14061: variable 'network_connections' from source: include params 46400 1727204600.14070: variable 'interface' from source: play vars 46400 1727204600.14129: variable 'interface' from source: play vars 46400 1727204600.14135: variable 'ansible_distribution' from source: facts 46400 1727204600.14138: variable '__network_rh_distros' from source: role '' defaults 46400 1727204600.14145: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.14158: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204600.14331: variable 'ansible_distribution' from source: facts 46400 1727204600.14335: variable '__network_rh_distros' from source: role '' defaults 46400 1727204600.14340: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.14352: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204600.14523: variable 'ansible_distribution' from source: facts 46400 1727204600.14527: variable '__network_rh_distros' from source: role '' defaults 46400 1727204600.14532: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.14571: variable 'network_provider' from source: set_fact 46400 1727204600.14595: variable 'omit' from source: magic vars 46400 1727204600.14622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204600.14649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204600.14672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204600.14690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204600.14700: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204600.14729: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204600.14732: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204600.14736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204600.14824: Set connection var ansible_shell_type to sh 46400 1727204600.14833: Set connection var ansible_shell_executable to /bin/sh 46400 1727204600.14839: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204600.14844: Set connection var ansible_connection to ssh 46400 1727204600.14850: Set connection var ansible_pipelining to False 46400 1727204600.14855: Set connection var ansible_timeout to 10 46400 1727204600.14888: variable 'ansible_shell_executable' from source: unknown 46400 1727204600.14892: variable 'ansible_connection' from source: unknown 46400 1727204600.14894: variable 'ansible_module_compression' from source: unknown 46400 1727204600.14896: variable 'ansible_shell_type' from source: unknown 46400 1727204600.14898: variable 'ansible_shell_executable' from source: unknown 46400 1727204600.14901: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204600.14904: variable 'ansible_pipelining' from source: unknown 46400 1727204600.14907: variable 'ansible_timeout' from source: unknown 46400 1727204600.14913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204600.15006: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204600.15017: variable 'omit' from source: magic vars 46400 1727204600.15023: starting attempt loop 46400 1727204600.15025: running the handler 46400 1727204600.15103: variable 'ansible_facts' from source: unknown 46400 1727204600.16116: _low_level_execute_command(): starting 46400 1727204600.16123: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204600.16873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204600.16887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.16897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.16911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.16951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.16958: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204600.16977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.16989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.16996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204600.17003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204600.17011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.17020: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204600.17031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.17038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.17045: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204600.17054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.17130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204600.17149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.17161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.17245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.18912: stdout chunk (state=3): >>>/root <<< 46400 1727204600.19014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204600.19111: stderr chunk (state=3): >>><<< 46400 1727204600.19117: stdout chunk (state=3): >>><<< 46400 1727204600.19147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204600.19160: _low_level_execute_command(): starting 46400 1727204600.19172: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284 `" && echo ansible-tmp-1727204600.1914744-52789-229388157109284="` echo /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284 `" ) && sleep 0' 46400 1727204600.19844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204600.19853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.19867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.19884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.19930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.19937: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204600.19947: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.19961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.19974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204600.19980: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204600.19988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.19997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.20036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.20044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.20051: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204600.20060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.20137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204600.20152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.20159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.20244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.22116: stdout chunk (state=3): >>>ansible-tmp-1727204600.1914744-52789-229388157109284=/root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284 <<< 46400 1727204600.22231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204600.22296: stderr chunk (state=3): >>><<< 46400 1727204600.22298: stdout chunk (state=3): >>><<< 46400 1727204600.22327: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204600.1914744-52789-229388157109284=/root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204600.22340: variable 'ansible_module_compression' from source: unknown 46400 1727204600.22388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204600.22434: variable 'ansible_facts' from source: unknown 46400 1727204600.22574: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/AnsiballZ_systemd.py 46400 1727204600.22692: Sending initial data 46400 1727204600.22696: Sent initial data (156 bytes) 46400 1727204600.23400: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.23407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.23439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204600.23445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.23454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.23463: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.23477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.23483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.23535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204600.23542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.23553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.23615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.25323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204600.25360: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204600.25403: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp50ed1w36 /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/AnsiballZ_systemd.py <<< 46400 1727204600.25439: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204600.28177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204600.28272: stderr chunk (state=3): >>><<< 46400 1727204600.28276: stdout chunk (state=3): >>><<< 46400 1727204600.28296: done transferring module to remote 46400 1727204600.28306: _low_level_execute_command(): starting 46400 1727204600.28311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/ /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/AnsiballZ_systemd.py && sleep 0' 46400 1727204600.28942: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204600.28952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.28959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.28979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.29039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.29069: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204600.29088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.29116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.29123: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204600.29129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204600.29136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.29145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.29155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.29165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.29176: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204600.29184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.29280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.29297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.29358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.31103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204600.31190: stderr chunk (state=3): >>><<< 46400 1727204600.31193: stdout chunk (state=3): >>><<< 46400 1727204600.31228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204600.31232: _low_level_execute_command(): starting 46400 1727204600.31236: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/AnsiballZ_systemd.py && sleep 0' 46400 1727204600.32076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204600.32079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.32085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.32087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.32089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.32091: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204600.32093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.32095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.32096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204600.32098: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204600.32100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.32102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.32104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.32105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.32107: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204600.32109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.32143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204600.32147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.32153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.32221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.57489: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204600.57536: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2204417000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204600.57545: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", 
"DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204600.59102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204600.59107: stderr chunk (state=3): >>><<< 46400 1727204600.59109: stdout chunk (state=3): >>><<< 46400 1727204600.59134: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2204417000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no 
data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", 
"RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204600.59325: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204600.59343: _low_level_execute_command(): starting 46400 1727204600.59348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204600.1914744-52789-229388157109284/ > /dev/null 2>&1 && sleep 0' 46400 1727204600.60696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204600.60784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.60794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.60809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.60892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.60899: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204600.60911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.60976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204600.60984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204600.60991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204600.60999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204600.61008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204600.61019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204600.61026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204600.61033: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204600.61042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204600.61159: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 46400 1727204600.61209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204600.61305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204600.61518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204600.63322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204600.63326: stdout chunk (state=3): >>><<< 46400 1727204600.63333: stderr chunk (state=3): >>><<< 46400 1727204600.63349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204600.63357: handler run complete 46400 1727204600.63423: attempt loop complete, returning result 46400 1727204600.63426: _execute() done 46400 1727204600.63429: dumping result to json 46400 1727204600.63448: done dumping result, returning 46400 1727204600.63454: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000001b47] 46400 1727204600.63460: sending task result for task 0affcd87-79f5-1303-fda8-000000001b47 46400 1727204600.63703: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b47 46400 1727204600.63706: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204600.63766: no more pending results, returning what we have 46400 1727204600.63770: results queue empty 46400 1727204600.63771: checking for any_errors_fatal 46400 1727204600.63780: done checking for any_errors_fatal 46400 1727204600.63781: checking for max_fail_percentage 46400 1727204600.63783: done checking for max_fail_percentage 46400 1727204600.63784: checking to see if all hosts have failed and the running result is not ok 46400 1727204600.63785: done checking to see if all hosts have failed 46400 1727204600.63786: getting the remaining hosts for this loop 46400 1727204600.63787: done getting the remaining hosts for this loop 46400 1727204600.63792: getting the next task for host managed-node2 46400 1727204600.63801: done getting next task for host managed-node2 46400 1727204600.63806: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204600.63811: ^ state is: HOST STATE: block=7, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204600.63826: getting variables 46400 1727204600.63828: in VariableManager get_vars() 46400 1727204600.63877: Calling all_inventory to load vars for managed-node2 46400 1727204600.63880: Calling groups_inventory to load vars for managed-node2 46400 1727204600.63883: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204600.63894: Calling all_plugins_play to load vars for managed-node2 46400 1727204600.63896: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204600.63899: Calling groups_plugins_play to load vars for managed-node2 46400 1727204600.66446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204600.71359: done with get_vars() 46400 1727204600.71396: done getting variables 46400 1727204600.71458: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.657) 0:01:30.999 ***** 46400 1727204600.71496: entering _queue_task() for managed-node2/service 46400 1727204600.72540: worker is 1 (out of 1 available) 46400 1727204600.72552: exiting _queue_task() for managed-node2/service 46400 1727204600.72571: done queuing things up, now waiting for results queue to drain 46400 1727204600.72572: waiting for pending results... 
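For orientation, the module_args recorded in the result above (name=NetworkManager, state=started, enabled=true, scope=system) correspond roughly to a task of the following shape. This is a reconstruction from the logged arguments only, not the literal text of roles/network/tasks/main.yml:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true

Because the unit was already enabled and running (ActiveState=active, UnitFileState=enabled in the status dump), the task reports ok with "changed": false.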
46400 1727204600.73253: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204600.73395: in run() - task 0affcd87-79f5-1303-fda8-000000001b48 46400 1727204600.73408: variable 'ansible_search_path' from source: unknown 46400 1727204600.73412: variable 'ansible_search_path' from source: unknown 46400 1727204600.73446: calling self._execute() 46400 1727204600.73540: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204600.73544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204600.73557: variable 'omit' from source: magic vars 46400 1727204600.74601: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.74612: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204600.74734: variable 'network_provider' from source: set_fact 46400 1727204600.74739: Evaluated conditional (network_provider == "nm"): True 46400 1727204600.74867: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204600.75170: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204600.75520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204600.81174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204600.81236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204600.81281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204600.81317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204600.81346: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204600.81446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.81480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.81507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.81549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.81568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.81615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.81639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204600.81668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.81713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.81720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.81761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204600.82594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204600.82622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.82659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204600.82679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204600.83170: variable 'network_connections' from source: include params 46400 1727204600.83173: variable 'interface' from source: play vars 46400 1727204600.83210: variable 'interface' from source: play vars 46400 1727204600.83291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204600.83474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204600.83512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204600.83541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204600.83573: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204600.83614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204600.83635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204600.83657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204600.83687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
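The variables resolved here ('__network_wpa_supplicant_required', '__network_ieee802_1x_connections_defined', and the wireless counterpart read just below) drive the wpa_supplicant decision. A sketch of the relationship implied by these names, assumed rather than copied from the role's defaults file:

    # Assumed shape of the guard variable, inferred from the variable names in this log.
    __network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"

Since the test play's connection list defines neither 802.1x nor wireless settings, the guard evaluates to False and the task is skipped.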
46400 1727204600.83734: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204600.84413: variable 'network_connections' from source: include params 46400 1727204600.84419: variable 'interface' from source: play vars 46400 1727204600.84710: variable 'interface' from source: play vars 46400 1727204600.84741: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204600.84744: when evaluation is False, skipping this task 46400 1727204600.84868: _execute() done 46400 1727204600.84871: dumping result to json 46400 1727204600.84874: done dumping result, returning 46400 1727204600.84882: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000001b48] 46400 1727204600.84894: sending task result for task 0affcd87-79f5-1303-fda8-000000001b48 46400 1727204600.84989: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b48 46400 1727204600.84993: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204600.85070: no more pending results, returning what we have 46400 1727204600.85075: results queue empty 46400 1727204600.85076: checking for any_errors_fatal 46400 1727204600.85096: done checking for any_errors_fatal 46400 1727204600.85097: checking for max_fail_percentage 46400 1727204600.85099: done checking for max_fail_percentage 46400 1727204600.85100: checking to see if all hosts have failed and the running result is not ok 46400 1727204600.85101: done checking to see if all hosts have failed 46400 1727204600.85101: getting the remaining hosts for this loop 46400 1727204600.85103: done getting the remaining hosts for this loop 46400 1727204600.85107: getting the next task for host managed-node2 46400 1727204600.85116: done getting next task for host managed-node2 46400 1727204600.85120: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204600.85125: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204600.85152: getting variables 46400 1727204600.85154: in VariableManager get_vars() 46400 1727204600.85202: Calling all_inventory to load vars for managed-node2 46400 1727204600.85205: Calling groups_inventory to load vars for managed-node2 46400 1727204600.85208: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204600.85219: Calling all_plugins_play to load vars for managed-node2 46400 1727204600.85222: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204600.85226: Calling groups_plugins_play to load vars for managed-node2 46400 1727204600.87540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204600.90567: done with get_vars() 46400 1727204600.90612: done getting variables 46400 1727204600.90709: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.192) 0:01:31.192 ***** 46400 1727204600.90745: entering _queue_task() for managed-node2/service 46400 1727204600.91169: worker is 1 (out of 1 available) 46400 1727204600.91182: exiting _queue_task() for managed-node2/service 46400 1727204600.91198: done queuing things up, now waiting for results queue to drain 46400 1727204600.91200: waiting for pending results... 46400 1727204600.91530: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204600.91698: in run() - task 0affcd87-79f5-1303-fda8-000000001b49 46400 1727204600.91720: variable 'ansible_search_path' from source: unknown 46400 1727204600.91728: variable 'ansible_search_path' from source: unknown 46400 1727204600.91777: calling self._execute() 46400 1727204600.91895: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204600.91911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204600.91925: variable 'omit' from source: magic vars 46400 1727204600.92339: variable 'ansible_distribution_major_version' from source: facts 46400 1727204600.92357: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204600.92485: variable 'network_provider' from source: set_fact 46400 1727204600.92496: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204600.92504: when evaluation is False, skipping this task 46400 1727204600.92512: _execute() done 46400 1727204600.92521: dumping result to json 46400 1727204600.92528: done dumping result, returning 46400 1727204600.92537: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000001b49] 46400 1727204600.92550: sending task result for task 0affcd87-79f5-1303-fda8-000000001b49 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204600.92712: no more pending results, returning what we have 46400 1727204600.92716: results queue empty 46400 1727204600.92718: checking for 
any_errors_fatal 46400 1727204600.92728: done checking for any_errors_fatal 46400 1727204600.92728: checking for max_fail_percentage 46400 1727204600.92730: done checking for max_fail_percentage 46400 1727204600.92732: checking to see if all hosts have failed and the running result is not ok 46400 1727204600.92732: done checking to see if all hosts have failed 46400 1727204600.92733: getting the remaining hosts for this loop 46400 1727204600.92735: done getting the remaining hosts for this loop 46400 1727204600.92739: getting the next task for host managed-node2 46400 1727204600.92750: done getting next task for host managed-node2 46400 1727204600.92755: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204600.92760: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204600.92793: getting variables 46400 1727204600.92796: in VariableManager get_vars() 46400 1727204600.92845: Calling all_inventory to load vars for managed-node2 46400 1727204600.92848: Calling groups_inventory to load vars for managed-node2 46400 1727204600.92851: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204600.92866: Calling all_plugins_play to load vars for managed-node2 46400 1727204600.92869: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204600.92872: Calling groups_plugins_play to load vars for managed-node2 46400 1727204600.94302: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b49 46400 1727204600.94306: WORKER PROCESS EXITING 46400 1727204601.10774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204601.14522: done with get_vars() 46400 1727204601.14566: done getting variables 46400 1727204601.14617: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.239) 0:01:31.431 ***** 46400 1727204601.14702: entering _queue_task() for managed-node2/copy 46400 1727204601.15525: worker is 1 (out of 1 available) 46400 1727204601.15538: exiting _queue_task() for managed-node2/copy 46400 1727204601.15552: done queuing things up, now waiting for results queue to drain 46400 1727204601.15553: waiting for pending results... 
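Both the 'Enable network service' skip above and the initscripts file-dependency task being queued here are gated on the active provider. A minimal sketch of such a guard, assuming the structure implied by the 'Evaluated conditional (network_provider == "initscripts"): False' lines and not the literal role source:

    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"

Because network_provider was set to "nm" earlier in the run, these initscripts-only tasks report 'skipping' with skip_reason 'Conditional result was False'.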
46400 1727204601.16922: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204601.17436: in run() - task 0affcd87-79f5-1303-fda8-000000001b4a 46400 1727204601.17476: variable 'ansible_search_path' from source: unknown 46400 1727204601.17486: variable 'ansible_search_path' from source: unknown 46400 1727204601.17531: calling self._execute() 46400 1727204601.17642: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.17780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.17796: variable 'omit' from source: magic vars 46400 1727204601.18192: variable 'ansible_distribution_major_version' from source: facts 46400 1727204601.18888: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204601.19025: variable 'network_provider' from source: set_fact 46400 1727204601.19037: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204601.19045: when evaluation is False, skipping this task 46400 1727204601.19053: _execute() done 46400 1727204601.19062: dumping result to json 46400 1727204601.19072: done dumping result, returning 46400 1727204601.19084: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000001b4a] 46400 1727204601.19096: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4a skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204601.19284: no more pending results, returning what we have 46400 1727204601.19288: results queue empty 46400 1727204601.19289: checking for any_errors_fatal 46400 1727204601.19298: done checking for any_errors_fatal 46400 1727204601.19299: checking for max_fail_percentage 46400 1727204601.19302: done checking for max_fail_percentage 46400 1727204601.19303: checking to see if all hosts have failed and the running result is not ok 46400 1727204601.19304: done checking to see if all hosts have failed 46400 1727204601.19305: getting the remaining hosts for this loop 46400 1727204601.19307: done getting the remaining hosts for this loop 46400 1727204601.19310: getting the next task for host managed-node2 46400 1727204601.19319: done getting next task for host managed-node2 46400 1727204601.19323: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204601.19329: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204601.19364: getting variables 46400 1727204601.19367: in VariableManager get_vars() 46400 1727204601.19414: Calling all_inventory to load vars for managed-node2 46400 1727204601.19417: Calling groups_inventory to load vars for managed-node2 46400 1727204601.19419: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204601.19433: Calling all_plugins_play to load vars for managed-node2 46400 1727204601.19436: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204601.19439: Calling groups_plugins_play to load vars for managed-node2 46400 1727204601.20006: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4a 46400 1727204601.20009: WORKER PROCESS EXITING 46400 1727204601.21133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204601.23243: done with get_vars() 46400 1727204601.23285: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.086) 0:01:31.518 ***** 46400 1727204601.23391: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204601.23783: worker is 1 (out of 1 available) 46400 1727204601.23796: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204601.23808: done queuing things up, now waiting for results queue to drain 46400 1727204601.23809: waiting for pending results... 
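The 'Configure networking connection profiles' task queued here hands the network_connections list to the role's network_connections module. A hypothetical example of the input it consumes; the values below are illustrative, while in this run the list comes from include params and the 'interface' play variable seen earlier in the log:

    # Illustrative input only; actual values come from the test play's include params.
    vars:
      interface: veth0          # assumed placeholder for the play's 'interface' variable
      network_connections:
        - name: "{{ interface }}"
          interface_name: "{{ interface }}"
          type: ethernet
          state: up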
46400 1727204601.24253: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204601.24713: in run() - task 0affcd87-79f5-1303-fda8-000000001b4b 46400 1727204601.24728: variable 'ansible_search_path' from source: unknown 46400 1727204601.24731: variable 'ansible_search_path' from source: unknown 46400 1727204601.24778: calling self._execute() 46400 1727204601.25012: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.25016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.25144: variable 'omit' from source: magic vars 46400 1727204601.26021: variable 'ansible_distribution_major_version' from source: facts 46400 1727204601.26033: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204601.26040: variable 'omit' from source: magic vars 46400 1727204601.26113: variable 'omit' from source: magic vars 46400 1727204601.26285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204601.30309: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204601.30466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204601.30522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204601.30574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204601.30602: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204601.30695: variable 'network_provider' from source: set_fact 46400 1727204601.30849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204601.30878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204601.31021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204601.31066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204601.31127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204601.31205: variable 'omit' from source: magic vars 46400 1727204601.31388: variable 'omit' from source: magic vars 46400 1727204601.31616: variable 'network_connections' from source: include params 46400 1727204601.31629: variable 'interface' from source: play vars 46400 1727204601.31808: variable 'interface' from source: play vars 46400 1727204601.32070: variable 'omit' from source: magic vars 46400 1727204601.32079: variable '__lsr_ansible_managed' from source: task vars 46400 1727204601.32276: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204601.32680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204601.33141: Loaded config def from plugin (lookup/template) 46400 1727204601.33145: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204601.33178: File lookup term: get_ansible_managed.j2 46400 1727204601.33181: variable 'ansible_search_path' from source: unknown 46400 1727204601.33185: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204601.33205: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204601.33220: variable 'ansible_search_path' from source: unknown 46400 1727204601.43872: variable 'ansible_managed' from source: unknown 46400 1727204601.43877: variable 'omit' from source: magic vars 46400 1727204601.43880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204601.43882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204601.43884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204601.43887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204601.43889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204601.43896: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204601.43899: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.43904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.44012: Set connection var ansible_shell_type to sh 46400 1727204601.44021: Set connection var ansible_shell_executable to /bin/sh 46400 1727204601.44026: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204601.44032: Set connection var ansible_connection to ssh 46400 1727204601.44038: Set connection var ansible_pipelining to False 46400 1727204601.44043: Set connection var ansible_timeout to 10 46400 1727204601.44081: variable 'ansible_shell_executable' from source: unknown 46400 1727204601.44084: variable 'ansible_connection' from source: unknown 46400 1727204601.44087: variable 'ansible_module_compression' 
from source: unknown 46400 1727204601.44090: variable 'ansible_shell_type' from source: unknown 46400 1727204601.44092: variable 'ansible_shell_executable' from source: unknown 46400 1727204601.44095: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.44100: variable 'ansible_pipelining' from source: unknown 46400 1727204601.44102: variable 'ansible_timeout' from source: unknown 46400 1727204601.44107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.44255: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204601.44272: variable 'omit' from source: magic vars 46400 1727204601.44279: starting attempt loop 46400 1727204601.44282: running the handler 46400 1727204601.44304: _low_level_execute_command(): starting 46400 1727204601.44310: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204601.45071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.45087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.45098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.45114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.45157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.45167: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.45181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.45198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204601.45205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.45212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.45219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.45228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.45240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.45247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.45254: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.45265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.45351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204601.45372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.45384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.45466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.47118: stdout chunk (state=3): >>>/root <<< 46400 1727204601.47290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204601.47342: stderr chunk (state=3): >>><<< 46400 1727204601.47345: stdout 
chunk (state=3): >>><<< 46400 1727204601.47379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204601.47392: _low_level_execute_command(): starting 46400 1727204601.47399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607 `" && echo ansible-tmp-1727204601.4737947-52834-247421065874607="` echo /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607 `" ) && sleep 0' 46400 1727204601.48201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.48209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.48220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.48236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.48306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.48320: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.48330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.48344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204601.48357: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.48376: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.48393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.48403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.48414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.48422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.48428: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.48439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.48520: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 46400 1727204601.48548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.48585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.48676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.50506: stdout chunk (state=3): >>>ansible-tmp-1727204601.4737947-52834-247421065874607=/root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607 <<< 46400 1727204601.50636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204601.50674: stderr chunk (state=3): >>><<< 46400 1727204601.50678: stdout chunk (state=3): >>><<< 46400 1727204601.50694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204601.4737947-52834-247421065874607=/root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204601.50736: variable 'ansible_module_compression' from source: unknown 46400 1727204601.50777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204601.50802: variable 'ansible_facts' from source: unknown 46400 1727204601.50870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/AnsiballZ_network_connections.py 46400 1727204601.50992: Sending initial data 46400 1727204601.50995: Sent initial data (168 bytes) 46400 1727204601.51895: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.51903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.51913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.51926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.51969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.51977: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.51987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.51999: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 46400 1727204601.52006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.52012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.52020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.52029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.52040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.52052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.52058: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.52070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.52139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204601.52158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.52175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.52239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.53924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204601.53963: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204601.53993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpij6w3d4h /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/AnsiballZ_network_connections.py <<< 46400 1727204601.54022: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204601.55379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204601.55537: stderr chunk (state=3): >>><<< 46400 1727204601.55540: stdout chunk (state=3): >>><<< 46400 1727204601.55570: done transferring module to remote 46400 1727204601.55576: _low_level_execute_command(): starting 46400 1727204601.55581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/ /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/AnsiballZ_network_connections.py && sleep 0' 46400 1727204601.56193: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.56197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.56218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.56221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.56262: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.56268: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.56429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.56433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204601.56442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.56444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.56447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.56449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.56451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.56453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.56455: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.56457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.56461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204601.56530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.56535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.56670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.58399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204601.58404: stdout chunk (state=3): >>><<< 46400 1727204601.58406: stderr chunk (state=3): >>><<< 46400 1727204601.58451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204601.58478: _low_level_execute_command(): starting 46400 1727204601.58481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/AnsiballZ_network_connections.py && sleep 0' 46400 1727204601.59431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.59435: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.59437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.59439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.59441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.59443: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.59445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.59447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204601.59449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.59451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.59452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.59454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.59456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.59458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.59463: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.59571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.59689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204601.59694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.59697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.59699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.83771: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6vck2lk9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6vck2lk9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b: error=unknown <<< 46400 1727204601.83951: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204601.85484: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204601.85549: stderr chunk (state=3): >>><<< 46400 1727204601.85553: stdout chunk (state=3): >>><<< 46400 1727204601.85574: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6vck2lk9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6vck2lk9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/02c5cf6c-04c5-4156-9f8b-cbb87fbb0c4b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
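For orientation, the module arguments echoed back in the result above (provider "nm" and a single connection named "statebr" with persistent_state "absent") correspond to a role invocation roughly like the sketch below. This is a reconstruction for readability only, not the test playbook itself: the play name and hosts line are assumptions, and network_provider / network_connections are the role's generally documented interface; only the connection entry and the provider value are taken from the logged module_args.

- name: Remove the statebr profile via the network role (illustrative sketch)
  hosts: managed-node2
  vars:
    network_provider: nm            # matches "provider": "nm" in the logged module_args
    network_connections:
      - name: statebr               # matches the logged connection entry
        persistent_state: absent
  roles:
    - fedora.linux_system_roles.network

Note that the "Connection volatilize aborted" traceback above arrives on the module's stdout ahead of the JSON result; the task nevertheless reports changed: true, as the parsed result a few entries below shows.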
46400 1727204601.85611: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204601.85621: _low_level_execute_command(): starting 46400 1727204601.85626: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204601.4737947-52834-247421065874607/ > /dev/null 2>&1 && sleep 0' 46400 1727204601.86257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204601.86271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.86284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.86297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.86334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.86339: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204601.86355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.86381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204601.86385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204601.86387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204601.86390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204601.86392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204601.86400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204601.86407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204601.86413: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204601.86422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204601.86490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204601.86504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204601.86509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204601.86584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204601.88491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204601.88495: stdout chunk (state=3): >>><<< 
46400 1727204601.88571: stderr chunk (state=3): >>><<< 46400 1727204601.88576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204601.88579: handler run complete 46400 1727204601.88581: attempt loop complete, returning result 46400 1727204601.88583: _execute() done 46400 1727204601.88585: dumping result to json 46400 1727204601.88587: done dumping result, returning 46400 1727204601.88770: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000001b4b] 46400 1727204601.88773: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4b 46400 1727204601.88855: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4b 46400 1727204601.88861: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 46400 1727204601.88963: no more pending results, returning what we have 46400 1727204601.88969: results queue empty 46400 1727204601.88970: checking for any_errors_fatal 46400 1727204601.88976: done checking for any_errors_fatal 46400 1727204601.88977: checking for max_fail_percentage 46400 1727204601.88979: done checking for max_fail_percentage 46400 1727204601.88979: checking to see if all hosts have failed and the running result is not ok 46400 1727204601.88980: done checking to see if all hosts have failed 46400 1727204601.88981: getting the remaining hosts for this loop 46400 1727204601.88982: done getting the remaining hosts for this loop 46400 1727204601.88986: getting the next task for host managed-node2 46400 1727204601.88993: done getting next task for host managed-node2 46400 1727204601.88997: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204601.89002: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204601.89015: getting variables 46400 1727204601.89017: in VariableManager get_vars() 46400 1727204601.89056: Calling all_inventory to load vars for managed-node2 46400 1727204601.89058: Calling groups_inventory to load vars for managed-node2 46400 1727204601.89060: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204601.89073: Calling all_plugins_play to load vars for managed-node2 46400 1727204601.89075: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204601.89077: Calling groups_plugins_play to load vars for managed-node2 46400 1727204601.90169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204601.91337: done with get_vars() 46400 1727204601.91384: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.680) 0:01:32.199 ***** 46400 1727204601.91483: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204601.91853: worker is 1 (out of 1 available) 46400 1727204601.91871: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204601.91885: done queuing things up, now waiting for results queue to drain 46400 1727204601.91887: waiting for pending results... 
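The entries that follow show this task being skipped: the role-default network_state is an empty dict, so the conditional network_state != {} evaluates to False. A minimal, self-contained illustration of the same guard is sketched below; it uses debug as a stand-in, because the role's real task calls the fedora.linux_system_roles.network_state module, whose arguments are not visible in this log.

- name: Illustrate the network_state guard (sketch, not the role's task file)
  hosts: localhost
  gather_facts: false
  vars:
    network_state: {}               # role default; a non-empty dict would let the task run
  tasks:
    - name: Configure networking state (stand-in)
      ansible.builtin.debug:
        msg: would only run when network_state is non-empty
      when: network_state != {}

Run against an empty network_state, this produces the same "Conditional result was False" skip that is reported below.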
46400 1727204601.92216: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204601.92394: in run() - task 0affcd87-79f5-1303-fda8-000000001b4c 46400 1727204601.92404: variable 'ansible_search_path' from source: unknown 46400 1727204601.92407: variable 'ansible_search_path' from source: unknown 46400 1727204601.92446: calling self._execute() 46400 1727204601.92530: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.92540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.92553: variable 'omit' from source: magic vars 46400 1727204601.92842: variable 'ansible_distribution_major_version' from source: facts 46400 1727204601.92852: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204601.92956: variable 'network_state' from source: role '' defaults 46400 1727204601.92967: Evaluated conditional (network_state != {}): False 46400 1727204601.92971: when evaluation is False, skipping this task 46400 1727204601.92975: _execute() done 46400 1727204601.92978: dumping result to json 46400 1727204601.92980: done dumping result, returning 46400 1727204601.92983: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000001b4c] 46400 1727204601.92994: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4c 46400 1727204601.93086: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4c 46400 1727204601.93089: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204601.93140: no more pending results, returning what we have 46400 1727204601.93144: results queue empty 46400 1727204601.93145: checking for any_errors_fatal 46400 1727204601.93165: done checking for any_errors_fatal 46400 1727204601.93166: checking for max_fail_percentage 46400 1727204601.93168: done checking for max_fail_percentage 46400 1727204601.93169: checking to see if all hosts have failed and the running result is not ok 46400 1727204601.93170: done checking to see if all hosts have failed 46400 1727204601.93170: getting the remaining hosts for this loop 46400 1727204601.93172: done getting the remaining hosts for this loop 46400 1727204601.93176: getting the next task for host managed-node2 46400 1727204601.93184: done getting next task for host managed-node2 46400 1727204601.93188: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204601.93194: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204601.93217: getting variables 46400 1727204601.93218: in VariableManager get_vars() 46400 1727204601.93256: Calling all_inventory to load vars for managed-node2 46400 1727204601.93261: Calling groups_inventory to load vars for managed-node2 46400 1727204601.93265: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204601.93275: Calling all_plugins_play to load vars for managed-node2 46400 1727204601.93278: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204601.93280: Calling groups_plugins_play to load vars for managed-node2 46400 1727204601.94121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204601.96102: done with get_vars() 46400 1727204601.96130: done getting variables 46400 1727204601.96203: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.047) 0:01:32.246 ***** 46400 1727204601.96243: entering _queue_task() for managed-node2/debug 46400 1727204601.96774: worker is 1 (out of 1 available) 46400 1727204601.96791: exiting _queue_task() for managed-node2/debug 46400 1727204601.96808: done queuing things up, now waiting for results queue to drain 46400 1727204601.96810: waiting for pending results... 
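Judging by the task name and by the output printed a few entries below (ok: [managed-node2] => { "__network_connections_result.stderr_lines": [...] }), this is a plain debug of the stderr lines captured from the preceding network_connections run. A sketch of what such a task looks like, reconstructed from the log rather than copied from the role source:

# assumes __network_connections_result was registered by the earlier
# fedora.linux_system_roles.network_connections task
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines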
46400 1727204601.97199: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204601.97399: in run() - task 0affcd87-79f5-1303-fda8-000000001b4d 46400 1727204601.97423: variable 'ansible_search_path' from source: unknown 46400 1727204601.97431: variable 'ansible_search_path' from source: unknown 46400 1727204601.97486: calling self._execute() 46400 1727204601.97604: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.97617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.97630: variable 'omit' from source: magic vars 46400 1727204601.98082: variable 'ansible_distribution_major_version' from source: facts 46400 1727204601.98107: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204601.98127: variable 'omit' from source: magic vars 46400 1727204601.98202: variable 'omit' from source: magic vars 46400 1727204601.98270: variable 'omit' from source: magic vars 46400 1727204601.98349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204601.98416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204601.98465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204601.98490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204601.98505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204601.98541: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204601.98557: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.98567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.98679: Set connection var ansible_shell_type to sh 46400 1727204601.98693: Set connection var ansible_shell_executable to /bin/sh 46400 1727204601.98703: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204601.98711: Set connection var ansible_connection to ssh 46400 1727204601.98724: Set connection var ansible_pipelining to False 46400 1727204601.98735: Set connection var ansible_timeout to 10 46400 1727204601.98782: variable 'ansible_shell_executable' from source: unknown 46400 1727204601.98790: variable 'ansible_connection' from source: unknown 46400 1727204601.98797: variable 'ansible_module_compression' from source: unknown 46400 1727204601.98803: variable 'ansible_shell_type' from source: unknown 46400 1727204601.98809: variable 'ansible_shell_executable' from source: unknown 46400 1727204601.98814: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204601.98820: variable 'ansible_pipelining' from source: unknown 46400 1727204601.98826: variable 'ansible_timeout' from source: unknown 46400 1727204601.98832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204601.98998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204601.99015: variable 'omit' from source: magic vars 46400 1727204601.99030: starting attempt loop 46400 1727204601.99038: running the handler 46400 1727204601.99204: variable '__network_connections_result' from source: set_fact 46400 1727204601.99269: handler run complete 46400 1727204601.99340: attempt loop complete, returning result 46400 1727204601.99347: _execute() done 46400 1727204601.99352: dumping result to json 46400 1727204601.99357: done dumping result, returning 46400 1727204601.99374: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000001b4d] 46400 1727204601.99386: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4d 46400 1727204601.99524: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4d ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 46400 1727204601.99609: no more pending results, returning what we have 46400 1727204601.99614: results queue empty 46400 1727204601.99615: checking for any_errors_fatal 46400 1727204601.99624: done checking for any_errors_fatal 46400 1727204601.99625: checking for max_fail_percentage 46400 1727204601.99627: done checking for max_fail_percentage 46400 1727204601.99628: checking to see if all hosts have failed and the running result is not ok 46400 1727204601.99629: done checking to see if all hosts have failed 46400 1727204601.99630: getting the remaining hosts for this loop 46400 1727204601.99632: done getting the remaining hosts for this loop 46400 1727204601.99637: getting the next task for host managed-node2 46400 1727204601.99647: done getting next task for host managed-node2 46400 1727204601.99652: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204601.99660: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204601.99678: getting variables 46400 1727204601.99680: in VariableManager get_vars() 46400 1727204601.99734: Calling all_inventory to load vars for managed-node2 46400 1727204601.99737: Calling groups_inventory to load vars for managed-node2 46400 1727204601.99739: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204601.99752: Calling all_plugins_play to load vars for managed-node2 46400 1727204601.99755: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204601.99758: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.00851: WORKER PROCESS EXITING 46400 1727204602.01884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.04236: done with get_vars() 46400 1727204602.04279: done getting variables 46400 1727204602.04346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.081) 0:01:32.328 ***** 46400 1727204602.04396: entering _queue_task() for managed-node2/debug 46400 1727204602.05055: worker is 1 (out of 1 available) 46400 1727204602.05086: exiting _queue_task() for managed-node2/debug 46400 1727204602.05111: done queuing things up, now waiting for results queue to drain 46400 1727204602.05114: waiting for pending results... 
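This task differs from the previous one only in what it prints: the output below shows the entire registered __network_connections_result (including _invocation, changed, failed, stderr and stderr_lines) rather than just the stderr lines. The likely shape of the task, again reconstructed from the log output rather than taken from the role source:

# assumes __network_connections_result was registered by the earlier
# fedora.linux_system_roles.network_connections task
- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result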
46400 1727204602.05529: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204602.05657: in run() - task 0affcd87-79f5-1303-fda8-000000001b4e 46400 1727204602.05671: variable 'ansible_search_path' from source: unknown 46400 1727204602.05675: variable 'ansible_search_path' from source: unknown 46400 1727204602.05712: calling self._execute() 46400 1727204602.05793: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.05797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.05805: variable 'omit' from source: magic vars 46400 1727204602.06099: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.06109: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.06115: variable 'omit' from source: magic vars 46400 1727204602.06168: variable 'omit' from source: magic vars 46400 1727204602.06191: variable 'omit' from source: magic vars 46400 1727204602.06226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204602.06256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204602.06277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204602.06291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204602.06301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204602.06325: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204602.06328: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.06332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.06403: Set connection var ansible_shell_type to sh 46400 1727204602.06411: Set connection var ansible_shell_executable to /bin/sh 46400 1727204602.06416: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204602.06421: Set connection var ansible_connection to ssh 46400 1727204602.06427: Set connection var ansible_pipelining to False 46400 1727204602.06431: Set connection var ansible_timeout to 10 46400 1727204602.06451: variable 'ansible_shell_executable' from source: unknown 46400 1727204602.06454: variable 'ansible_connection' from source: unknown 46400 1727204602.06457: variable 'ansible_module_compression' from source: unknown 46400 1727204602.06463: variable 'ansible_shell_type' from source: unknown 46400 1727204602.06466: variable 'ansible_shell_executable' from source: unknown 46400 1727204602.06468: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.06470: variable 'ansible_pipelining' from source: unknown 46400 1727204602.06474: variable 'ansible_timeout' from source: unknown 46400 1727204602.06477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.06580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204602.06594: variable 'omit' from source: magic vars 46400 1727204602.06597: starting attempt loop 46400 1727204602.06599: running the handler 46400 1727204602.06634: variable '__network_connections_result' from source: set_fact 46400 1727204602.06697: variable '__network_connections_result' from source: set_fact 46400 1727204602.06773: handler run complete 46400 1727204602.06791: attempt loop complete, returning result 46400 1727204602.06794: _execute() done 46400 1727204602.06797: dumping result to json 46400 1727204602.06800: done dumping result, returning 46400 1727204602.06808: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000001b4e] 46400 1727204602.06816: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4e 46400 1727204602.06907: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4e 46400 1727204602.06909: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 46400 1727204602.07001: no more pending results, returning what we have 46400 1727204602.07005: results queue empty 46400 1727204602.07006: checking for any_errors_fatal 46400 1727204602.07014: done checking for any_errors_fatal 46400 1727204602.07015: checking for max_fail_percentage 46400 1727204602.07017: done checking for max_fail_percentage 46400 1727204602.07018: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.07018: done checking to see if all hosts have failed 46400 1727204602.07019: getting the remaining hosts for this loop 46400 1727204602.07021: done getting the remaining hosts for this loop 46400 1727204602.07025: getting the next task for host managed-node2 46400 1727204602.07032: done getting next task for host managed-node2 46400 1727204602.07037: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204602.07041: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204602.07054: getting variables 46400 1727204602.07055: in VariableManager get_vars() 46400 1727204602.07103: Calling all_inventory to load vars for managed-node2 46400 1727204602.07106: Calling groups_inventory to load vars for managed-node2 46400 1727204602.07108: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.07117: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.07120: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.07122: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.08143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.09067: done with get_vars() 46400 1727204602.09085: done getting variables 46400 1727204602.09132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.047) 0:01:32.376 ***** 46400 1727204602.09162: entering _queue_task() for managed-node2/debug 46400 1727204602.09418: worker is 1 (out of 1 available) 46400 1727204602.09432: exiting _queue_task() for managed-node2/debug 46400 1727204602.09445: done queuing things up, now waiting for results queue to drain 46400 1727204602.09447: waiting for pending results... 46400 1727204602.09647: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204602.09767: in run() - task 0affcd87-79f5-1303-fda8-000000001b4f 46400 1727204602.09778: variable 'ansible_search_path' from source: unknown 46400 1727204602.09781: variable 'ansible_search_path' from source: unknown 46400 1727204602.09812: calling self._execute() 46400 1727204602.09900: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.09905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.09908: variable 'omit' from source: magic vars 46400 1727204602.10190: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.10200: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.10291: variable 'network_state' from source: role '' defaults 46400 1727204602.10299: Evaluated conditional (network_state != {}): False 46400 1727204602.10302: when evaluation is False, skipping this task 46400 1727204602.10305: _execute() done 46400 1727204602.10308: dumping result to json 46400 1727204602.10310: done dumping result, returning 46400 1727204602.10316: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000001b4f] 46400 1727204602.10322: sending task result for task 0affcd87-79f5-1303-fda8-000000001b4f 46400 1727204602.10419: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b4f 46400 1727204602.10422: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204602.10498: no more pending results, returning what we 
have 46400 1727204602.10502: results queue empty 46400 1727204602.10503: checking for any_errors_fatal 46400 1727204602.10511: done checking for any_errors_fatal 46400 1727204602.10512: checking for max_fail_percentage 46400 1727204602.10514: done checking for max_fail_percentage 46400 1727204602.10514: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.10515: done checking to see if all hosts have failed 46400 1727204602.10516: getting the remaining hosts for this loop 46400 1727204602.10518: done getting the remaining hosts for this loop 46400 1727204602.10522: getting the next task for host managed-node2 46400 1727204602.10529: done getting next task for host managed-node2 46400 1727204602.10534: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204602.10538: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204602.10572: getting variables 46400 1727204602.10574: in VariableManager get_vars() 46400 1727204602.10609: Calling all_inventory to load vars for managed-node2 46400 1727204602.10612: Calling groups_inventory to load vars for managed-node2 46400 1727204602.10613: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.10622: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.10625: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.10627: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.11432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.12372: done with get_vars() 46400 1727204602.12389: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.032) 0:01:32.409 ***** 46400 1727204602.12461: entering _queue_task() for managed-node2/ping 46400 1727204602.12698: worker is 1 (out of 1 available) 46400 1727204602.12712: exiting _queue_task() for managed-node2/ping 46400 1727204602.12725: done queuing things up, now waiting for results queue to drain 46400 1727204602.12727: waiting for pending results... 
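The entries below confirm that this re-test uses the bundled ping module (the cached AnsiballZ payload is ansible.modules.ping), which simply round-trips a small Python module over the existing SSH connection. A minimal equivalent, reconstructed rather than copied from the role:

- name: Re-test connectivity
  ansible.builtin.ping:

When the managed host's Python is reachable, this returns "ping": "pong"; the surrounding log shows the same tmp-directory creation and module-transfer steps as for the earlier module run.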
46400 1727204602.12931: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204602.13038: in run() - task 0affcd87-79f5-1303-fda8-000000001b50 46400 1727204602.13049: variable 'ansible_search_path' from source: unknown 46400 1727204602.13058: variable 'ansible_search_path' from source: unknown 46400 1727204602.13090: calling self._execute() 46400 1727204602.13166: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.13175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.13183: variable 'omit' from source: magic vars 46400 1727204602.13468: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.13477: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.13485: variable 'omit' from source: magic vars 46400 1727204602.13533: variable 'omit' from source: magic vars 46400 1727204602.13557: variable 'omit' from source: magic vars 46400 1727204602.13596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204602.13625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204602.13644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204602.13658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204602.13670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204602.13694: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204602.13697: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.13699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.13768: Set connection var ansible_shell_type to sh 46400 1727204602.13778: Set connection var ansible_shell_executable to /bin/sh 46400 1727204602.13782: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204602.13787: Set connection var ansible_connection to ssh 46400 1727204602.13792: Set connection var ansible_pipelining to False 46400 1727204602.13797: Set connection var ansible_timeout to 10 46400 1727204602.13817: variable 'ansible_shell_executable' from source: unknown 46400 1727204602.13820: variable 'ansible_connection' from source: unknown 46400 1727204602.13823: variable 'ansible_module_compression' from source: unknown 46400 1727204602.13826: variable 'ansible_shell_type' from source: unknown 46400 1727204602.13829: variable 'ansible_shell_executable' from source: unknown 46400 1727204602.13831: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.13833: variable 'ansible_pipelining' from source: unknown 46400 1727204602.13835: variable 'ansible_timeout' from source: unknown 46400 1727204602.13838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.13989: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204602.13997: variable 'omit' from source: magic vars 46400 
1727204602.14001: starting attempt loop 46400 1727204602.14004: running the handler 46400 1727204602.14015: _low_level_execute_command(): starting 46400 1727204602.14022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204602.14563: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.14581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.14596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204602.14621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.14671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.14683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.14734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.16384: stdout chunk (state=3): >>>/root <<< 46400 1727204602.16491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204602.16545: stderr chunk (state=3): >>><<< 46400 1727204602.16555: stdout chunk (state=3): >>><<< 46400 1727204602.16580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204602.16593: _low_level_execute_command(): starting 46400 1727204602.16599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602 `" && echo ansible-tmp-1727204602.1658118-52951-191267707150602="` 
echo /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602 `" ) && sleep 0' 46400 1727204602.17057: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.17068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.17104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204602.17116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.17174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.17184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.17235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.19094: stdout chunk (state=3): >>>ansible-tmp-1727204602.1658118-52951-191267707150602=/root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602 <<< 46400 1727204602.19206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204602.19263: stderr chunk (state=3): >>><<< 46400 1727204602.19274: stdout chunk (state=3): >>><<< 46400 1727204602.19292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204602.1658118-52951-191267707150602=/root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204602.19330: variable 'ansible_module_compression' from source: unknown 46400 1727204602.19367: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204602.19395: 
variable 'ansible_facts' from source: unknown 46400 1727204602.19451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/AnsiballZ_ping.py 46400 1727204602.19555: Sending initial data 46400 1727204602.19559: Sent initial data (153 bytes) 46400 1727204602.20243: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204602.20249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.20286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.20299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.20356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.20361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.20441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.22149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204602.22186: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204602.22218: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp6bymh1fl /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/AnsiballZ_ping.py <<< 46400 1727204602.22260: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204602.23274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204602.23376: stderr chunk (state=3): >>><<< 46400 1727204602.23380: stdout chunk (state=3): >>><<< 46400 1727204602.23398: done transferring module to remote 46400 1727204602.23407: _low_level_execute_command(): starting 46400 1727204602.23413: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/ /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/AnsiballZ_ping.py && sleep 0' 46400 1727204602.23867: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.23871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.23920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.23923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204602.23926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.23931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.23982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204602.23985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.24001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.24049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.25797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204602.25881: stderr chunk (state=3): >>><<< 46400 1727204602.25884: stdout chunk (state=3): >>><<< 46400 1727204602.25903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204602.25910: _low_level_execute_command(): starting 46400 1727204602.25913: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/AnsiballZ_ping.py && sleep 0' 46400 1727204602.26558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204602.26568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204602.26582: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204602.26593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.26634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204602.26641: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204602.26651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.26666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204602.26675: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204602.26682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204602.26690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204602.26699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.26711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.26718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204602.26725: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204602.26735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.26809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204602.26823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.26829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.26908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.40052: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204602.41162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204602.41169: stdout chunk (state=3): >>><<< 46400 1727204602.41172: stderr chunk (state=3): >>><<< 46400 1727204602.41308: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204602.41312: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204602.41314: _low_level_execute_command(): starting 46400 1727204602.41316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204602.1658118-52951-191267707150602/ > /dev/null 2>&1 && sleep 0' 46400 1727204602.42038: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204602.42042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.42082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204602.42087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204602.42090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204602.42149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204602.42168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204602.42242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204602.44068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204602.44170: stderr chunk (state=3): >>><<< 46400 1727204602.44184: stdout chunk (state=3): >>><<< 46400 1727204602.44271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204602.44280: handler run complete 46400 1727204602.44282: attempt loop complete, returning result 46400 1727204602.44284: _execute() done 46400 1727204602.44286: dumping result to json 46400 1727204602.44289: done dumping result, returning 46400 1727204602.44291: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000001b50] 46400 1727204602.44293: sending task result for task 0affcd87-79f5-1303-fda8-000000001b50 46400 1727204602.44551: done sending task result for task 0affcd87-79f5-1303-fda8-000000001b50 46400 1727204602.44554: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204602.44668: no more pending results, returning what we have 46400 1727204602.44673: results queue empty 46400 1727204602.44674: checking for any_errors_fatal 46400 1727204602.44684: done checking for any_errors_fatal 46400 1727204602.44685: checking for max_fail_percentage 46400 1727204602.44687: done checking for max_fail_percentage 46400 1727204602.44688: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.44689: done checking to see if all hosts have failed 46400 1727204602.44690: getting the remaining hosts for this loop 46400 1727204602.44692: done getting the remaining hosts for this loop 46400 1727204602.44696: getting the next task for host managed-node2 46400 1727204602.44709: done getting next task for host managed-node2 46400 1727204602.44712: ^ task is: TASK: meta (role_complete) 46400 1727204602.44718: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204602.44735: getting variables 46400 1727204602.44737: in VariableManager get_vars() 46400 1727204602.44817: Calling all_inventory to load vars for managed-node2 46400 1727204602.44820: Calling groups_inventory to load vars for managed-node2 46400 1727204602.44823: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.44835: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.44839: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.44843: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.47004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.48814: done with get_vars() 46400 1727204602.48851: done getting variables 46400 1727204602.48947: done queuing things up, now waiting for results queue to drain 46400 1727204602.48949: results queue empty 46400 1727204602.48950: checking for any_errors_fatal 46400 1727204602.48953: done checking for any_errors_fatal 46400 1727204602.48954: checking for max_fail_percentage 46400 1727204602.48955: done checking for max_fail_percentage 46400 1727204602.48956: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.48957: done checking to see if all hosts have failed 46400 1727204602.48957: getting the remaining hosts for this loop 46400 1727204602.48958: done getting the remaining hosts for this loop 46400 1727204602.48965: getting the next task for host managed-node2 46400 1727204602.48971: done getting next task for host managed-node2 46400 1727204602.48974: ^ task is: TASK: Test 46400 1727204602.48976: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204602.48979: getting variables 46400 1727204602.48980: in VariableManager get_vars() 46400 1727204602.48994: Calling all_inventory to load vars for managed-node2 46400 1727204602.48997: Calling groups_inventory to load vars for managed-node2 46400 1727204602.48999: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.49005: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.49007: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.49010: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.50321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.52410: done with get_vars() 46400 1727204602.52443: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.400) 0:01:32.809 ***** 46400 1727204602.52528: entering _queue_task() for managed-node2/include_tasks 46400 1727204602.52904: worker is 1 (out of 1 available) 46400 1727204602.52918: exiting _queue_task() for managed-node2/include_tasks 46400 1727204602.52932: done queuing things up, now waiting for results queue to drain 46400 1727204602.52933: waiting for pending results... 46400 1727204602.53252: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204602.53401: in run() - task 0affcd87-79f5-1303-fda8-000000001748 46400 1727204602.53421: variable 'ansible_search_path' from source: unknown 46400 1727204602.53433: variable 'ansible_search_path' from source: unknown 46400 1727204602.53492: variable 'lsr_test' from source: include params 46400 1727204602.53724: variable 'lsr_test' from source: include params 46400 1727204602.53803: variable 'omit' from source: magic vars 46400 1727204602.53965: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.53985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.53999: variable 'omit' from source: magic vars 46400 1727204602.54268: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.54282: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.54293: variable 'item' from source: unknown 46400 1727204602.54373: variable 'item' from source: unknown 46400 1727204602.54419: variable 'item' from source: unknown 46400 1727204602.54493: variable 'item' from source: unknown 46400 1727204602.54653: dumping result to json 46400 1727204602.54669: done dumping result, returning 46400 1727204602.54679: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-000000001748] 46400 1727204602.54690: sending task result for task 0affcd87-79f5-1303-fda8-000000001748 46400 1727204602.54786: no more pending results, returning what we have 46400 1727204602.54791: in VariableManager get_vars() 46400 1727204602.54843: Calling all_inventory to load vars for managed-node2 46400 1727204602.54846: Calling groups_inventory to load vars for managed-node2 46400 1727204602.54849: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.54868: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.54872: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.54875: Calling groups_plugins_play to 
load vars for managed-node2 46400 1727204602.55924: done sending task result for task 0affcd87-79f5-1303-fda8-000000001748 46400 1727204602.55928: WORKER PROCESS EXITING 46400 1727204602.57094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.59106: done with get_vars() 46400 1727204602.59140: variable 'ansible_search_path' from source: unknown 46400 1727204602.59141: variable 'ansible_search_path' from source: unknown 46400 1727204602.59190: we have included files to process 46400 1727204602.59191: generating all_blocks data 46400 1727204602.59194: done generating all_blocks data 46400 1727204602.59200: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204602.59201: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204602.59204: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204602.59423: done processing included file 46400 1727204602.59425: iterating over new_blocks loaded from include file 46400 1727204602.59426: in VariableManager get_vars() 46400 1727204602.59444: done with get_vars() 46400 1727204602.59446: filtering new block on tags 46400 1727204602.59483: done filtering new block on tags 46400 1727204602.59485: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 46400 1727204602.59491: extending task lists for all hosts with included blocks 46400 1727204602.60531: done extending task lists 46400 1727204602.60533: done processing included files 46400 1727204602.60534: results queue empty 46400 1727204602.60534: checking for any_errors_fatal 46400 1727204602.60536: done checking for any_errors_fatal 46400 1727204602.60537: checking for max_fail_percentage 46400 1727204602.60538: done checking for max_fail_percentage 46400 1727204602.60543: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.60544: done checking to see if all hosts have failed 46400 1727204602.60544: getting the remaining hosts for this loop 46400 1727204602.60546: done getting the remaining hosts for this loop 46400 1727204602.60548: getting the next task for host managed-node2 46400 1727204602.60553: done getting next task for host managed-node2 46400 1727204602.60555: ^ task is: TASK: Include network role 46400 1727204602.60558: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204602.60562: getting variables 46400 1727204602.60565: in VariableManager get_vars() 46400 1727204602.60578: Calling all_inventory to load vars for managed-node2 46400 1727204602.60581: Calling groups_inventory to load vars for managed-node2 46400 1727204602.60583: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.60589: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.60592: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.60595: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.61996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.64122: done with get_vars() 46400 1727204602.64158: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.117) 0:01:32.927 ***** 46400 1727204602.64270: entering _queue_task() for managed-node2/include_role 46400 1727204602.64669: worker is 1 (out of 1 available) 46400 1727204602.64681: exiting _queue_task() for managed-node2/include_role 46400 1727204602.64693: done queuing things up, now waiting for results queue to drain 46400 1727204602.64694: waiting for pending results... 46400 1727204602.65003: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204602.65151: in run() - task 0affcd87-79f5-1303-fda8-000000001ca9 46400 1727204602.65178: variable 'ansible_search_path' from source: unknown 46400 1727204602.65188: variable 'ansible_search_path' from source: unknown 46400 1727204602.65227: calling self._execute() 46400 1727204602.65336: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.65349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.65375: variable 'omit' from source: magic vars 46400 1727204602.65788: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.65811: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.65821: _execute() done 46400 1727204602.65828: dumping result to json 46400 1727204602.65835: done dumping result, returning 46400 1727204602.65843: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000001ca9] 46400 1727204602.65852: sending task result for task 0affcd87-79f5-1303-fda8-000000001ca9 46400 1727204602.66011: no more pending results, returning what we have 46400 1727204602.66017: in VariableManager get_vars() 46400 1727204602.66075: Calling all_inventory to load vars for managed-node2 46400 1727204602.66078: Calling groups_inventory to load vars for managed-node2 46400 1727204602.66082: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.66096: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.66100: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.66103: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.67233: done sending task result for task 0affcd87-79f5-1303-fda8-000000001ca9 46400 1727204602.67237: WORKER PROCESS EXITING 46400 1727204602.67925: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.70090: done with get_vars() 46400 1727204602.70128: variable 'ansible_search_path' from source: unknown 46400 1727204602.70129: variable 'ansible_search_path' from source: unknown 46400 1727204602.70302: variable 'omit' from source: magic vars 46400 1727204602.70356: variable 'omit' from source: magic vars 46400 1727204602.70375: variable 'omit' from source: magic vars 46400 1727204602.70379: we have included files to process 46400 1727204602.70380: generating all_blocks data 46400 1727204602.70382: done generating all_blocks data 46400 1727204602.70383: processing included file: fedora.linux_system_roles.network 46400 1727204602.70401: in VariableManager get_vars() 46400 1727204602.70416: done with get_vars() 46400 1727204602.70447: in VariableManager get_vars() 46400 1727204602.70469: done with get_vars() 46400 1727204602.70506: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204602.70630: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204602.70725: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204602.71233: in VariableManager get_vars() 46400 1727204602.71255: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204602.73343: iterating over new_blocks loaded from include file 46400 1727204602.73346: in VariableManager get_vars() 46400 1727204602.73370: done with get_vars() 46400 1727204602.73373: filtering new block on tags 46400 1727204602.73749: done filtering new block on tags 46400 1727204602.73752: in VariableManager get_vars() 46400 1727204602.73774: done with get_vars() 46400 1727204602.73776: filtering new block on tags 46400 1727204602.73793: done filtering new block on tags 46400 1727204602.73796: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204602.73801: extending task lists for all hosts with included blocks 46400 1727204602.73929: done extending task lists 46400 1727204602.73931: done processing included files 46400 1727204602.73931: results queue empty 46400 1727204602.73932: checking for any_errors_fatal 46400 1727204602.73936: done checking for any_errors_fatal 46400 1727204602.73937: checking for max_fail_percentage 46400 1727204602.73938: done checking for max_fail_percentage 46400 1727204602.73939: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.73940: done checking to see if all hosts have failed 46400 1727204602.73941: getting the remaining hosts for this loop 46400 1727204602.73942: done getting the remaining hosts for this loop 46400 1727204602.73945: getting the next task for host managed-node2 46400 1727204602.73950: done getting next task for host managed-node2 46400 1727204602.73953: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204602.73957: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204602.73973: getting variables 46400 1727204602.73974: in VariableManager get_vars() 46400 1727204602.73988: Calling all_inventory to load vars for managed-node2 46400 1727204602.73991: Calling groups_inventory to load vars for managed-node2 46400 1727204602.73993: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.73999: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.74001: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.74004: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.75531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.77319: done with get_vars() 46400 1727204602.77351: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.131) 0:01:33.059 ***** 46400 1727204602.77445: entering _queue_task() for managed-node2/include_tasks 46400 1727204602.77830: worker is 1 (out of 1 available) 46400 1727204602.77842: exiting _queue_task() for managed-node2/include_tasks 46400 1727204602.77863: done queuing things up, now waiting for results queue to drain 46400 1727204602.77865: waiting for pending results... 
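The TASK header above queues "Ensure ansible_facts used by role" (roles/network/tasks/main.yml:4) as an include_tasks action, and the entries that follow show it loading roles/network/tasks/set_facts.yml. The task's source is not included in this log; the sketch below is illustrative only, with the include target inferred from the "processing included file ... set_facts.yml" entries further down.

    # Illustrative sketch only -- not part of the captured log.
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml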
46400 1727204602.78178: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204602.78358: in run() - task 0affcd87-79f5-1303-fda8-000000001d2b 46400 1727204602.78386: variable 'ansible_search_path' from source: unknown 46400 1727204602.78396: variable 'ansible_search_path' from source: unknown 46400 1727204602.78448: calling self._execute() 46400 1727204602.78563: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.78579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.78595: variable 'omit' from source: magic vars 46400 1727204602.79016: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.79033: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.79044: _execute() done 46400 1727204602.79052: dumping result to json 46400 1727204602.79069: done dumping result, returning 46400 1727204602.79081: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000001d2b] 46400 1727204602.79092: sending task result for task 0affcd87-79f5-1303-fda8-000000001d2b 46400 1727204602.79258: no more pending results, returning what we have 46400 1727204602.79268: in VariableManager get_vars() 46400 1727204602.79324: Calling all_inventory to load vars for managed-node2 46400 1727204602.79327: Calling groups_inventory to load vars for managed-node2 46400 1727204602.79330: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.79343: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.79346: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.79349: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.80412: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d2b 46400 1727204602.80416: WORKER PROCESS EXITING 46400 1727204602.81373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.83183: done with get_vars() 46400 1727204602.83208: variable 'ansible_search_path' from source: unknown 46400 1727204602.83209: variable 'ansible_search_path' from source: unknown 46400 1727204602.83251: we have included files to process 46400 1727204602.83253: generating all_blocks data 46400 1727204602.83254: done generating all_blocks data 46400 1727204602.83258: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204602.83261: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204602.83263: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204602.83963: done processing included file 46400 1727204602.83968: iterating over new_blocks loaded from include file 46400 1727204602.83970: in VariableManager get_vars() 46400 1727204602.84000: done with get_vars() 46400 1727204602.84002: filtering new block on tags 46400 1727204602.84047: done filtering new block on tags 46400 1727204602.84056: in VariableManager get_vars() 46400 1727204602.84094: done with get_vars() 46400 1727204602.84096: filtering new block on tags 46400 1727204602.84150: done filtering new block on tags 46400 1727204602.84153: in 
VariableManager get_vars() 46400 1727204602.84185: done with get_vars() 46400 1727204602.84187: filtering new block on tags 46400 1727204602.84238: done filtering new block on tags 46400 1727204602.84240: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204602.84246: extending task lists for all hosts with included blocks 46400 1727204602.86177: done extending task lists 46400 1727204602.86179: done processing included files 46400 1727204602.86180: results queue empty 46400 1727204602.86180: checking for any_errors_fatal 46400 1727204602.86187: done checking for any_errors_fatal 46400 1727204602.86188: checking for max_fail_percentage 46400 1727204602.86190: done checking for max_fail_percentage 46400 1727204602.86191: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.86192: done checking to see if all hosts have failed 46400 1727204602.86193: getting the remaining hosts for this loop 46400 1727204602.86194: done getting the remaining hosts for this loop 46400 1727204602.86197: getting the next task for host managed-node2 46400 1727204602.86202: done getting next task for host managed-node2 46400 1727204602.86205: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204602.86209: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204602.86220: getting variables 46400 1727204602.86221: in VariableManager get_vars() 46400 1727204602.86239: Calling all_inventory to load vars for managed-node2 46400 1727204602.86241: Calling groups_inventory to load vars for managed-node2 46400 1727204602.86243: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.86248: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.86251: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.86253: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.87645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204602.89567: done with get_vars() 46400 1727204602.89599: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.122) 0:01:33.181 ***** 46400 1727204602.89689: entering _queue_task() for managed-node2/setup 46400 1727204602.90061: worker is 1 (out of 1 available) 46400 1727204602.90080: exiting _queue_task() for managed-node2/setup 46400 1727204602.90095: done queuing things up, now waiting for results queue to drain 46400 1727204602.90097: waiting for pending results... 46400 1727204602.90414: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204602.90607: in run() - task 0affcd87-79f5-1303-fda8-000000001d82 46400 1727204602.90628: variable 'ansible_search_path' from source: unknown 46400 1727204602.90639: variable 'ansible_search_path' from source: unknown 46400 1727204602.90690: calling self._execute() 46400 1727204602.90795: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204602.90806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204602.90819: variable 'omit' from source: magic vars 46400 1727204602.91214: variable 'ansible_distribution_major_version' from source: facts 46400 1727204602.91230: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204602.91470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204602.94091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204602.94180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204602.94233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204602.94293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204602.94326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204602.94441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204602.94484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204602.94561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204602.94689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204602.94709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204602.94780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204602.94810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204602.94840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204602.94898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204602.94917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204602.95105: variable '__network_required_facts' from source: role '' defaults 46400 1727204602.95119: variable 'ansible_facts' from source: unknown 46400 1727204602.95925: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204602.95934: when evaluation is False, skipping this task 46400 1727204602.95941: _execute() done 46400 1727204602.95954: dumping result to json 46400 1727204602.95965: done dumping result, returning 46400 1727204602.95976: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000001d82] 46400 1727204602.95986: sending task result for task 0affcd87-79f5-1303-fda8-000000001d82 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204602.96137: no more pending results, returning what we have 46400 1727204602.96143: results queue empty 46400 1727204602.96144: checking for any_errors_fatal 46400 1727204602.96146: done checking for any_errors_fatal 46400 1727204602.96146: checking for max_fail_percentage 46400 1727204602.96148: done checking for max_fail_percentage 46400 1727204602.96149: checking to see if all hosts have failed and the running result is not ok 46400 1727204602.96150: done checking to see if all hosts have failed 46400 1727204602.96151: getting the remaining hosts for this loop 46400 1727204602.96153: done getting the remaining hosts for this loop 46400 1727204602.96158: getting the next task for host managed-node2 46400 1727204602.96176: done getting next task for host 
managed-node2 46400 1727204602.96180: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204602.96186: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204602.96217: getting variables 46400 1727204602.96219: in VariableManager get_vars() 46400 1727204602.96270: Calling all_inventory to load vars for managed-node2 46400 1727204602.96273: Calling groups_inventory to load vars for managed-node2 46400 1727204602.96275: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204602.96287: Calling all_plugins_play to load vars for managed-node2 46400 1727204602.96290: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204602.96292: Calling groups_plugins_play to load vars for managed-node2 46400 1727204602.97310: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d82 46400 1727204602.97321: WORKER PROCESS EXITING 46400 1727204602.98263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204603.00031: done with get_vars() 46400 1727204603.00075: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.105) 0:01:33.286 ***** 46400 1727204603.00197: entering _queue_task() for managed-node2/stat 46400 1727204603.00580: worker is 1 (out of 1 available) 46400 1727204603.00599: exiting _queue_task() for managed-node2/stat 46400 1727204603.00612: done queuing things up, now waiting for results queue to drain 46400 1727204603.00613: waiting for pending results... 
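The two guards from set_facts.yml that appear around this point are both skipped in this run: the fact-gathering task at set_facts.yml:3 because every fact listed in __network_required_facts is already present, and the ostree check at set_facts.yml:12 because __network_is_ostree was already set by an earlier set_fact. Neither task's source is reproduced in the log; the sketch below is illustrative only, with the action plugins (setup, stat) and the when expressions taken from the surrounding entries and everything else (the stat path, the register name) assumed for illustration.

    # Illustrative sketch only -- not part of the captured log.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:   # module arguments are hidden by no_log in this run
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path, not visible in the log
      register: __ostree_booted_stat    # hypothetical register name
      when: not __network_is_ostree is defined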
46400 1727204603.00950: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204603.01143: in run() - task 0affcd87-79f5-1303-fda8-000000001d84 46400 1727204603.01170: variable 'ansible_search_path' from source: unknown 46400 1727204603.01182: variable 'ansible_search_path' from source: unknown 46400 1727204603.01227: calling self._execute() 46400 1727204603.01348: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204603.01370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204603.01386: variable 'omit' from source: magic vars 46400 1727204603.01774: variable 'ansible_distribution_major_version' from source: facts 46400 1727204603.01795: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204603.01982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204603.02305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204603.02369: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204603.02414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204603.02462: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204603.02572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204603.02608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204603.02641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204603.02684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204603.02796: variable '__network_is_ostree' from source: set_fact 46400 1727204603.02810: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204603.02821: when evaluation is False, skipping this task 46400 1727204603.02827: _execute() done 46400 1727204603.02834: dumping result to json 46400 1727204603.02841: done dumping result, returning 46400 1727204603.02852: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000001d84] 46400 1727204603.02867: sending task result for task 0affcd87-79f5-1303-fda8-000000001d84 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204603.03044: no more pending results, returning what we have 46400 1727204603.03049: results queue empty 46400 1727204603.03050: checking for any_errors_fatal 46400 1727204603.03062: done checking for any_errors_fatal 46400 1727204603.03065: checking for max_fail_percentage 46400 1727204603.03067: done checking for max_fail_percentage 46400 1727204603.03068: checking to see if all hosts have 
failed and the running result is not ok 46400 1727204603.03069: done checking to see if all hosts have failed 46400 1727204603.03069: getting the remaining hosts for this loop 46400 1727204603.03071: done getting the remaining hosts for this loop 46400 1727204603.03076: getting the next task for host managed-node2 46400 1727204603.03086: done getting next task for host managed-node2 46400 1727204603.03090: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204603.03096: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204603.03123: getting variables 46400 1727204603.03125: in VariableManager get_vars() 46400 1727204603.03178: Calling all_inventory to load vars for managed-node2 46400 1727204603.03181: Calling groups_inventory to load vars for managed-node2 46400 1727204603.03183: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204603.03194: Calling all_plugins_play to load vars for managed-node2 46400 1727204603.03197: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204603.03200: Calling groups_plugins_play to load vars for managed-node2 46400 1727204603.04236: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d84 46400 1727204603.04240: WORKER PROCESS EXITING 46400 1727204603.05014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204603.07086: done with get_vars() 46400 1727204603.07112: done getting variables 46400 1727204603.07184: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.070) 0:01:33.356 ***** 46400 1727204603.07224: entering _queue_task() for managed-node2/set_fact 46400 1727204603.07595: worker is 1 (out of 1 available) 46400 1727204603.07611: exiting _queue_task() for managed-node2/set_fact 46400 1727204603.07625: done queuing things up, now waiting for results queue to drain 46400 1727204603.07627: waiting for pending results... 
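The set_fact task queued above (tasks/set_facts.yml:17) would normally persist the result of the preceding stat check as __network_is_ostree. The value expression in the sketch below is an assumption based on that pattern; only the module (set_fact) and the when: condition are confirmed by the surrounding log.

    # Hedged sketch: the exact value expression is assumed, not shown in this log.
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumed source of the flag
      when: not __network_is_ostree is defined

Because the fact is already defined on managed-node2, this task is skipped with the same false_condition as the check itself.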
46400 1727204603.07936: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204603.08124: in run() - task 0affcd87-79f5-1303-fda8-000000001d85 46400 1727204603.08143: variable 'ansible_search_path' from source: unknown 46400 1727204603.08150: variable 'ansible_search_path' from source: unknown 46400 1727204603.08201: calling self._execute() 46400 1727204603.08309: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204603.08322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204603.08336: variable 'omit' from source: magic vars 46400 1727204603.08735: variable 'ansible_distribution_major_version' from source: facts 46400 1727204603.08753: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204603.08929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204603.09222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204603.09290: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204603.09332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204603.09384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204603.09490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204603.09522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204603.09555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204603.09601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204603.09711: variable '__network_is_ostree' from source: set_fact 46400 1727204603.09724: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204603.09732: when evaluation is False, skipping this task 46400 1727204603.09739: _execute() done 46400 1727204603.09746: dumping result to json 46400 1727204603.09752: done dumping result, returning 46400 1727204603.09770: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000001d85] 46400 1727204603.09781: sending task result for task 0affcd87-79f5-1303-fda8-000000001d85 46400 1727204603.09910: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d85 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204603.09968: no more pending results, returning what we have 46400 1727204603.09974: results queue empty 46400 1727204603.09975: checking for any_errors_fatal 46400 1727204603.09985: done checking for any_errors_fatal 46400 1727204603.09986: checking for max_fail_percentage 46400 
1727204603.09989: done checking for max_fail_percentage 46400 1727204603.09990: checking to see if all hosts have failed and the running result is not ok 46400 1727204603.09991: done checking to see if all hosts have failed 46400 1727204603.09992: getting the remaining hosts for this loop 46400 1727204603.09994: done getting the remaining hosts for this loop 46400 1727204603.09998: getting the next task for host managed-node2 46400 1727204603.10014: done getting next task for host managed-node2 46400 1727204603.10019: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204603.10025: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204603.10056: getting variables 46400 1727204603.10058: in VariableManager get_vars() 46400 1727204603.10114: Calling all_inventory to load vars for managed-node2 46400 1727204603.10117: Calling groups_inventory to load vars for managed-node2 46400 1727204603.10120: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204603.10131: Calling all_plugins_play to load vars for managed-node2 46400 1727204603.10134: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204603.10137: Calling groups_plugins_play to load vars for managed-node2 46400 1727204603.11211: WORKER PROCESS EXITING 46400 1727204603.12046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204603.13120: done with get_vars() 46400 1727204603.13142: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.059) 0:01:33.416 ***** 46400 1727204603.13224: entering _queue_task() for managed-node2/service_facts 46400 1727204603.13481: worker is 1 (out of 1 available) 46400 1727204603.13494: exiting _queue_task() for managed-node2/service_facts 46400 1727204603.13508: done queuing things up, now waiting for results queue to drain 46400 1727204603.13510: waiting for pending results... 
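The service_facts task queued above (tasks/set_facts.yml:21) needs no parameters; a minimal form is sketched below. The module name is confirmed by the queue entry, and the structure of the returned data is confirmed by the JSON later in this log, but the second task is purely an illustrative example of consuming those facts and is not part of this run.

    # Minimal sketch; service_facts takes no arguments.
    - name: Check which services are running
      ansible.builtin.service_facts:

    # Example only (not in this run): the gathered data lands in
    # ansible_facts.services, keyed by unit name, so a later task can branch on it.
    - name: Act when NetworkManager is running
      ansible.builtin.debug:
        msg: NetworkManager is active on this host
      when: ansible_facts.services['NetworkManager.service'].state == 'running'

The transcript that follows shows the module being shipped to the node as AnsiballZ_service_facts.py over the existing SSH ControlMaster session and returning exactly this services dictionary for managed-node2.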
46400 1727204603.13702: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204603.13793: in run() - task 0affcd87-79f5-1303-fda8-000000001d87 46400 1727204603.13804: variable 'ansible_search_path' from source: unknown 46400 1727204603.13809: variable 'ansible_search_path' from source: unknown 46400 1727204603.13838: calling self._execute() 46400 1727204603.13915: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204603.13938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204603.13944: variable 'omit' from source: magic vars 46400 1727204603.14346: variable 'ansible_distribution_major_version' from source: facts 46400 1727204603.14369: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204603.14381: variable 'omit' from source: magic vars 46400 1727204603.14493: variable 'omit' from source: magic vars 46400 1727204603.14538: variable 'omit' from source: magic vars 46400 1727204603.14587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204603.14633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204603.14666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204603.14687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204603.14704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204603.14743: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204603.14752: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204603.14761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204603.14869: Set connection var ansible_shell_type to sh 46400 1727204603.14885: Set connection var ansible_shell_executable to /bin/sh 46400 1727204603.14896: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204603.14906: Set connection var ansible_connection to ssh 46400 1727204603.14917: Set connection var ansible_pipelining to False 46400 1727204603.14930: Set connection var ansible_timeout to 10 46400 1727204603.14968: variable 'ansible_shell_executable' from source: unknown 46400 1727204603.14978: variable 'ansible_connection' from source: unknown 46400 1727204603.14986: variable 'ansible_module_compression' from source: unknown 46400 1727204603.14993: variable 'ansible_shell_type' from source: unknown 46400 1727204603.15000: variable 'ansible_shell_executable' from source: unknown 46400 1727204603.15007: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204603.15016: variable 'ansible_pipelining' from source: unknown 46400 1727204603.15023: variable 'ansible_timeout' from source: unknown 46400 1727204603.15030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204603.15253: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204603.15280: variable 'omit' from source: magic vars 46400 
1727204603.15290: starting attempt loop 46400 1727204603.15297: running the handler 46400 1727204603.15322: _low_level_execute_command(): starting 46400 1727204603.15329: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204603.15858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.15873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.15897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.15912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.15965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204603.15979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204603.16034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204603.17684: stdout chunk (state=3): >>>/root <<< 46400 1727204603.17876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204603.17879: stdout chunk (state=3): >>><<< 46400 1727204603.17882: stderr chunk (state=3): >>><<< 46400 1727204603.17972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204603.17977: _low_level_execute_command(): starting 46400 1727204603.17979: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627 `" && echo ansible-tmp-1727204603.1790185-52988-95851386603627="` 
echo /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627 `" ) && sleep 0' 46400 1727204603.18552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204603.18576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204603.18591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.18608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.18650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204603.18667: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204603.18691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.18731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204603.18746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204603.18759: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204603.18793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204603.18817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.18831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.18844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204603.18856: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204603.18893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.18982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204603.19006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204603.19052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204603.20945: stdout chunk (state=3): >>>ansible-tmp-1727204603.1790185-52988-95851386603627=/root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627 <<< 46400 1727204603.21062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204603.21167: stderr chunk (state=3): >>><<< 46400 1727204603.21180: stdout chunk (state=3): >>><<< 46400 1727204603.21376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204603.1790185-52988-95851386603627=/root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204603.21379: variable 'ansible_module_compression' from source: unknown 46400 1727204603.21382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204603.21384: variable 'ansible_facts' from source: unknown 46400 1727204603.21458: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/AnsiballZ_service_facts.py 46400 1727204603.21642: Sending initial data 46400 1727204603.21645: Sent initial data (161 bytes) 46400 1727204603.22648: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204603.22652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.22691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.22694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.22697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.22765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204603.22781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204603.22878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204603.24568: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204603.24598: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204603.24638: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpnhci6zdi /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/AnsiballZ_service_facts.py <<< 46400 1727204603.24675: stderr 
chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204603.25755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204603.25947: stderr chunk (state=3): >>><<< 46400 1727204603.25951: stdout chunk (state=3): >>><<< 46400 1727204603.25974: done transferring module to remote 46400 1727204603.25981: _low_level_execute_command(): starting 46400 1727204603.25986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/ /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/AnsiballZ_service_facts.py && sleep 0' 46400 1727204603.26449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.26456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.26491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204603.26495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204603.26506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204603.26512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.26519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204603.26526: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204603.26534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.26599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204603.26604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204603.26656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204603.28400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204603.28467: stderr chunk (state=3): >>><<< 46400 1727204603.28471: stdout chunk (state=3): >>><<< 46400 1727204603.28481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204603.28485: _low_level_execute_command(): starting 46400 1727204603.28490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/AnsiballZ_service_facts.py && sleep 0' 46400 1727204603.28966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204603.28970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204603.28999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.29011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204603.29062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204603.29078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204603.29133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204604.60025: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 46400 1727204604.60047: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 46400 1727204604.60101: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-up<<< 46400 1727204604.60108: stdout chunk (state=3): >>>date.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": 
{"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204604.61395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204604.61399: stderr chunk (state=3): >>><<< 46400 1727204604.61401: stdout chunk (state=3): >>><<< 46400 1727204604.61429: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204604.63182: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204604.63190: _low_level_execute_command(): starting 46400 1727204604.63195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204603.1790185-52988-95851386603627/ > /dev/null 2>&1 && sleep 0' 46400 1727204604.65371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204604.65375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204604.65378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204604.65380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204604.65382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204604.65384: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204604.65386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.65388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204604.65390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204604.65392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204604.65394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204604.65396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204604.65398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204604.65400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204604.65402: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204604.65404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.65405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204604.65474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204604.65486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204604.65630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204604.67566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204604.67570: stdout chunk (state=3): >>><<< 46400 1727204604.67579: stderr chunk (state=3): >>><<< 46400 1727204604.67597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204604.67604: handler run complete 46400 1727204604.67791: variable 'ansible_facts' from source: unknown 46400 1727204604.68405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204604.69203: variable 'ansible_facts' from source: unknown 46400 1727204604.69332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204604.69532: attempt loop complete, returning result 46400 1727204604.69535: _execute() done 46400 1727204604.69538: dumping result to json 46400 1727204604.69600: done dumping result, returning 46400 1727204604.69609: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000001d87] 46400 1727204604.69623: sending task result for task 0affcd87-79f5-1303-fda8-000000001d87 46400 1727204604.87266: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d87 46400 1727204604.87270: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204604.87353: no more pending results, returning what we have 46400 1727204604.87356: results queue empty 46400 1727204604.87357: checking for any_errors_fatal 46400 1727204604.87360: done checking for any_errors_fatal 46400 1727204604.87361: checking for max_fail_percentage 46400 1727204604.87362: done checking for max_fail_percentage 46400 1727204604.87363: checking to see if all hosts have failed and the running result is not ok 46400 1727204604.87366: done checking to see if all hosts have failed 46400 1727204604.87366: getting the remaining hosts for this loop 46400 1727204604.87368: done getting the remaining hosts for this loop 46400 1727204604.87371: getting the next task for host managed-node2 46400 1727204604.87376: done getting next task for host managed-node2 46400 1727204604.87379: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204604.87386: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204604.87402: getting variables 46400 1727204604.87403: in VariableManager get_vars() 46400 1727204604.87429: Calling all_inventory to load vars for managed-node2 46400 1727204604.87432: Calling groups_inventory to load vars for managed-node2 46400 1727204604.87434: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204604.87441: Calling all_plugins_play to load vars for managed-node2 46400 1727204604.87444: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204604.87447: Calling groups_plugins_play to load vars for managed-node2 46400 1727204604.88691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204604.90214: done with get_vars() 46400 1727204604.90245: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:24 -0400 (0:00:01.771) 0:01:35.187 ***** 46400 1727204604.90342: entering _queue_task() for managed-node2/package_facts 46400 1727204604.90704: worker is 1 (out of 1 available) 46400 1727204604.90717: exiting _queue_task() for managed-node2/package_facts 46400 1727204604.90730: done queuing things up, now waiting for results queue to drain 46400 1727204604.90731: waiting for pending results... 
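[Editor's note, not part of the captured log] Around this point the log traces one full per-task remote execution cycle: the previous task's temp directory is removed (rm -f -r), then for the next task a uniquely named "ansible-tmp-<timestamp>-<pid>-<random>" directory is created, the AnsiballZ_package_facts.py payload is transferred over sftp, chmod'ed, executed with the remote Python, and its JSON result read back. The sketch below replays that shape locally, with no SSH, purely to make the sequence visible; the helper name, the local-only execution, and the exact composition of the unique suffix are assumptions, not Ansible internals.

```python
import json
import random
import subprocess
import tempfile
import time
from pathlib import Path

# Step 1: create a uniquely named per-task temp dir, shaped like the
# "ansible-tmp-<time>-<pid-ish>-<random>" names visible in the log.
def make_task_tmpdir(base: Path) -> Path:
    name = f"ansible-tmp-{time.time()}-{random.randint(0, 2**32)}"
    path = base / name
    path.mkdir(parents=True, mode=0o700)
    return path

base = Path(tempfile.gettempdir()) / ".ansible-demo" / "tmp"
tmpdir = make_task_tmpdir(base)

# Step 2: "transfer" a stand-in module payload (the log uses sftp put).
module = tmpdir / "AnsiballZ_demo.py"
module.write_text('import json; print(json.dumps({"changed": False}))\n')

# Step 3: chmod u+x and run it with the target's Python, capturing the JSON
# result on stdout, as the executor does with AnsiballZ_package_facts.py.
module.chmod(0o700)
result = subprocess.run(["python3", str(module)],
                        capture_output=True, text=True, check=True)
print(json.loads(result.stdout))  # {'changed': False}

# Step 4: clean up the temp dir, mirroring the log's "rm -f -r ... && sleep 0".
subprocess.run(["rm", "-rf", str(tmpdir)], check=True)
```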
46400 1727204604.91190: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204604.91379: in run() - task 0affcd87-79f5-1303-fda8-000000001d88 46400 1727204604.91401: variable 'ansible_search_path' from source: unknown 46400 1727204604.91409: variable 'ansible_search_path' from source: unknown 46400 1727204604.91457: calling self._execute() 46400 1727204604.91574: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204604.91587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204604.91601: variable 'omit' from source: magic vars 46400 1727204604.92058: variable 'ansible_distribution_major_version' from source: facts 46400 1727204604.92079: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204604.92094: variable 'omit' from source: magic vars 46400 1727204604.92194: variable 'omit' from source: magic vars 46400 1727204604.92238: variable 'omit' from source: magic vars 46400 1727204604.92288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204604.92335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204604.92362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204604.92387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204604.92402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204604.92442: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204604.92450: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204604.92457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204604.92595: Set connection var ansible_shell_type to sh 46400 1727204604.92625: Set connection var ansible_shell_executable to /bin/sh 46400 1727204604.92667: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204604.92685: Set connection var ansible_connection to ssh 46400 1727204604.92714: Set connection var ansible_pipelining to False 46400 1727204604.92724: Set connection var ansible_timeout to 10 46400 1727204604.92754: variable 'ansible_shell_executable' from source: unknown 46400 1727204604.92770: variable 'ansible_connection' from source: unknown 46400 1727204604.92780: variable 'ansible_module_compression' from source: unknown 46400 1727204604.92794: variable 'ansible_shell_type' from source: unknown 46400 1727204604.92808: variable 'ansible_shell_executable' from source: unknown 46400 1727204604.92822: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204604.92835: variable 'ansible_pipelining' from source: unknown 46400 1727204604.92843: variable 'ansible_timeout' from source: unknown 46400 1727204604.92846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204604.93016: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204604.93027: variable 'omit' from source: magic vars 46400 
1727204604.93030: starting attempt loop 46400 1727204604.93033: running the handler 46400 1727204604.93052: _low_level_execute_command(): starting 46400 1727204604.93059: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204604.93570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204604.93575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204604.93613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.93627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.93680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204604.93687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204604.93771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204604.95440: stdout chunk (state=3): >>>/root <<< 46400 1727204604.95579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204604.95586: stdout chunk (state=3): >>><<< 46400 1727204604.95596: stderr chunk (state=3): >>><<< 46400 1727204604.95615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204604.95627: _low_level_execute_command(): starting 46400 1727204604.95633: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074 `" && echo ansible-tmp-1727204604.956146-53035-233251139840074="` 
echo /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074 `" ) && sleep 0' 46400 1727204604.96090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204604.96097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204604.96143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.96147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204604.96157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.96200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204604.96204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204604.96270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204604.98110: stdout chunk (state=3): >>>ansible-tmp-1727204604.956146-53035-233251139840074=/root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074 <<< 46400 1727204604.98221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204604.98270: stderr chunk (state=3): >>><<< 46400 1727204604.98273: stdout chunk (state=3): >>><<< 46400 1727204604.98294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204604.956146-53035-233251139840074=/root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204604.98330: variable 'ansible_module_compression' from source: unknown 46400 1727204604.98371: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204604.98422: variable 'ansible_facts' from source: unknown 46400 1727204604.98557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/AnsiballZ_package_facts.py 46400 1727204604.98674: Sending initial data 46400 1727204604.98677: Sent initial data (161 bytes) 46400 1727204604.99339: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204604.99345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204604.99387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204604.99396: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204604.99402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204604.99452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204604.99475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204604.99514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204605.01214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204605.01251: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204605.01289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp_qigqtzg /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/AnsiballZ_package_facts.py <<< 46400 1727204605.01325: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204605.03035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204605.03143: stderr chunk (state=3): >>><<< 46400 1727204605.03147: stdout chunk (state=3): >>><<< 46400 1727204605.03169: done transferring module to remote 46400 1727204605.03179: _low_level_execute_command(): starting 46400 1727204605.03184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/ /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/AnsiballZ_package_facts.py && sleep 0' 46400 1727204605.03651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.03657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204605.03692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204605.03706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.03718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204605.03770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204605.03788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204605.03821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204605.05518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204605.05567: stderr chunk (state=3): >>><<< 46400 1727204605.05570: stdout chunk (state=3): >>><<< 46400 1727204605.05583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204605.05590: _low_level_execute_command(): starting 46400 1727204605.05593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/AnsiballZ_package_facts.py && sleep 0' 46400 1727204605.06043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.06061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204605.06089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204605.06094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204605.06104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.06114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204605.06157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204605.06179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204605.06234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204605.52662: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 46400 1727204605.52674: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": 
"7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204605.52686: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204605.52748: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204605.52759: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", 
"version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": 
[{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204605.52767: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204605.52810: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204605.52819: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", 
"release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204605.52823: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204605.52851: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", 
"version": "1.1.1", "release": "12.el<<< 46400 1727204605.52855: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204605.54339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204605.54425: stderr chunk (state=3): >>><<< 46400 1727204605.54428: stdout chunk (state=3): >>><<< 46400 1727204605.54780: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": 
"libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": 
"19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": 
"8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": 
[{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204605.57183: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204605.57212: _low_level_execute_command(): starting 46400 1727204605.57221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204604.956146-53035-233251139840074/ > /dev/null 2>&1 && sleep 0' 46400 1727204605.57921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204605.57938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204605.57951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.57969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204605.58009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204605.58020: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204605.58040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204605.58057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204605.58071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204605.58081: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204605.58092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204605.58105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204605.58122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204605.58139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204605.58156: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204605.58174: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204605.58256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204605.58278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204605.58293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204605.58380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204605.60183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204605.60250: stderr chunk (state=3): >>><<< 46400 1727204605.60253: stdout chunk (state=3): >>><<< 46400 1727204605.60275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204605.60281: handler run complete 46400 1727204605.61219: variable 'ansible_facts' from source: unknown 46400 1727204605.61729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.64277: variable 'ansible_facts' from source: unknown 46400 1727204605.64605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.65383: attempt loop complete, returning result 46400 1727204605.65396: _execute() done 46400 1727204605.65399: dumping result to json 46400 1727204605.65639: done dumping result, returning 46400 1727204605.65650: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000001d88] 46400 1727204605.65657: sending task result for task 0affcd87-79f5-1303-fda8-000000001d88 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204605.68078: no more pending results, returning what we have 46400 1727204605.68081: results queue empty 46400 1727204605.68082: checking for any_errors_fatal 46400 1727204605.68088: done checking for any_errors_fatal 46400 1727204605.68088: checking for max_fail_percentage 46400 1727204605.68090: done checking for max_fail_percentage 46400 1727204605.68091: checking to see if all hosts have failed and the running result is not ok 46400 1727204605.68091: done checking to see if all hosts have failed 46400 1727204605.68092: getting the remaining hosts for this loop 46400 
1727204605.68093: done getting the remaining hosts for this loop 46400 1727204605.68097: getting the next task for host managed-node2 46400 1727204605.68104: done getting next task for host managed-node2 46400 1727204605.68108: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204605.68114: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204605.68125: getting variables 46400 1727204605.68126: in VariableManager get_vars() 46400 1727204605.68162: Calling all_inventory to load vars for managed-node2 46400 1727204605.68166: Calling groups_inventory to load vars for managed-node2 46400 1727204605.68169: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204605.68179: Calling all_plugins_play to load vars for managed-node2 46400 1727204605.68185: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204605.68187: Calling groups_plugins_play to load vars for managed-node2 46400 1727204605.69112: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d88 46400 1727204605.69116: WORKER PROCESS EXITING 46400 1727204605.69789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.71519: done with get_vars() 46400 1727204605.71552: done getting variables 46400 1727204605.71609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.813) 0:01:36.001 ***** 46400 1727204605.71649: entering _queue_task() for managed-node2/debug 46400 1727204605.71997: worker is 1 (out of 1 available) 46400 1727204605.72010: exiting _queue_task() for managed-node2/debug 46400 1727204605.72022: done queuing things up, now waiting for results queue to drain 46400 1727204605.72024: waiting for pending results... 
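
The package_facts payload dumped above maps each package name to a list of {name, version, release, epoch, arch, source} entries (a list, because more than one version or architecture of the same name can be installed at once). The following minimal Python sketch shows one way to consume such a payload; the packages.json file name and the installed_evr helper are hypothetical, used purely for illustration and not part of this run.

import json
from typing import Optional

def installed_evr(packages: dict, name: str) -> Optional[str]:
    """Return 'epoch:version-release' for the first entry of a package, or None if absent."""
    entries = packages.get(name, [])
    if not entries:
        return None
    pkg = entries[0]  # each name maps to a list of install records
    evr = f"{pkg['version']}-{pkg['release']}"
    epoch = pkg.get("epoch")
    return f"{epoch}:{evr}" if epoch not in (None, 0) else evr

if __name__ == "__main__":
    # Hypothetical dump of the facts shown above, e.g. saved from a registered result.
    with open("packages.json") as fh:
        packages = json.load(fh)
    for name in ("wpa_supplicant", "hostapd", "dnsmasq"):
        print(name, installed_evr(packages, name))
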
46400 1727204605.72335: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204605.72521: in run() - task 0affcd87-79f5-1303-fda8-000000001d2c 46400 1727204605.72544: variable 'ansible_search_path' from source: unknown 46400 1727204605.72552: variable 'ansible_search_path' from source: unknown 46400 1727204605.72598: calling self._execute() 46400 1727204605.72711: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.72723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.72741: variable 'omit' from source: magic vars 46400 1727204605.73206: variable 'ansible_distribution_major_version' from source: facts 46400 1727204605.73227: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204605.73237: variable 'omit' from source: magic vars 46400 1727204605.73311: variable 'omit' from source: magic vars 46400 1727204605.73423: variable 'network_provider' from source: set_fact 46400 1727204605.73453: variable 'omit' from source: magic vars 46400 1727204605.73504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204605.73550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204605.73581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204605.73602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204605.73623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204605.73659: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204605.73670: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.73677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.73783: Set connection var ansible_shell_type to sh 46400 1727204605.73799: Set connection var ansible_shell_executable to /bin/sh 46400 1727204605.73811: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204605.73820: Set connection var ansible_connection to ssh 46400 1727204605.73829: Set connection var ansible_pipelining to False 46400 1727204605.73843: Set connection var ansible_timeout to 10 46400 1727204605.73875: variable 'ansible_shell_executable' from source: unknown 46400 1727204605.73883: variable 'ansible_connection' from source: unknown 46400 1727204605.73889: variable 'ansible_module_compression' from source: unknown 46400 1727204605.73894: variable 'ansible_shell_type' from source: unknown 46400 1727204605.73900: variable 'ansible_shell_executable' from source: unknown 46400 1727204605.73905: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.73912: variable 'ansible_pipelining' from source: unknown 46400 1727204605.73917: variable 'ansible_timeout' from source: unknown 46400 1727204605.73924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.74062: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204605.74079: variable 'omit' from source: magic vars 46400 1727204605.74149: starting attempt loop 46400 1727204605.74158: running the handler 46400 1727204605.74403: handler run complete 46400 1727204605.74419: attempt loop complete, returning result 46400 1727204605.74425: _execute() done 46400 1727204605.74431: dumping result to json 46400 1727204605.74438: done dumping result, returning 46400 1727204605.74449: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000001d2c] 46400 1727204605.74483: sending task result for task 0affcd87-79f5-1303-fda8-000000001d2c ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204605.74661: no more pending results, returning what we have 46400 1727204605.74668: results queue empty 46400 1727204605.74670: checking for any_errors_fatal 46400 1727204605.74684: done checking for any_errors_fatal 46400 1727204605.74685: checking for max_fail_percentage 46400 1727204605.74688: done checking for max_fail_percentage 46400 1727204605.74689: checking to see if all hosts have failed and the running result is not ok 46400 1727204605.74690: done checking to see if all hosts have failed 46400 1727204605.74691: getting the remaining hosts for this loop 46400 1727204605.74695: done getting the remaining hosts for this loop 46400 1727204605.74699: getting the next task for host managed-node2 46400 1727204605.74709: done getting next task for host managed-node2 46400 1727204605.74713: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204605.74719: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204605.74734: getting variables 46400 1727204605.74735: in VariableManager get_vars() 46400 1727204605.74782: Calling all_inventory to load vars for managed-node2 46400 1727204605.74785: Calling groups_inventory to load vars for managed-node2 46400 1727204605.74787: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204605.74797: Calling all_plugins_play to load vars for managed-node2 46400 1727204605.74800: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204605.74803: Calling groups_plugins_play to load vars for managed-node2 46400 1727204605.75822: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d2c 46400 1727204605.75826: WORKER PROCESS EXITING 46400 1727204605.76724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.79804: done with get_vars() 46400 1727204605.79837: done getting variables 46400 1727204605.79902: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.082) 0:01:36.084 ***** 46400 1727204605.79949: entering _queue_task() for managed-node2/fail 46400 1727204605.80291: worker is 1 (out of 1 available) 46400 1727204605.80304: exiting _queue_task() for managed-node2/fail 46400 1727204605.80317: done queuing things up, now waiting for results queue to drain 46400 1727204605.80318: waiting for pending results... 
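
Each "Evaluated conditional (...)" entry above and below records Ansible rendering a task's when: expression against the host's variables with Jinja2 before deciding whether to run or skip the task. The sketch below is a rough illustration of that idea only, not Ansible's actual code path; the variable values mirror what this run reports (distribution major version "9", network_state defaulting to {}).

from jinja2 import Environment

def evaluate_when(condition: str, variables: dict) -> bool:
    # Conceptually, a when: clause is wrapped in an if-block and rendered as a template.
    template = "{% if " + condition + " %}True{% else %}False{% endif %}"
    return Environment().from_string(template).render(**variables) == "True"

facts = {"ansible_distribution_major_version": "9", "network_state": {}}
print(evaluate_when("ansible_distribution_major_version != '6'", facts))     # True
print(evaluate_when("network_state != {}", facts))                           # False -> task skipped
print(evaluate_when("ansible_distribution_major_version | int > 9", facts))  # False -> task skipped
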
46400 1727204605.81254: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204605.81639: in run() - task 0affcd87-79f5-1303-fda8-000000001d2d 46400 1727204605.81651: variable 'ansible_search_path' from source: unknown 46400 1727204605.81655: variable 'ansible_search_path' from source: unknown 46400 1727204605.81695: calling self._execute() 46400 1727204605.81909: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.81913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.82041: variable 'omit' from source: magic vars 46400 1727204605.82904: variable 'ansible_distribution_major_version' from source: facts 46400 1727204605.82916: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204605.83151: variable 'network_state' from source: role '' defaults 46400 1727204605.83164: Evaluated conditional (network_state != {}): False 46400 1727204605.83170: when evaluation is False, skipping this task 46400 1727204605.83176: _execute() done 46400 1727204605.83178: dumping result to json 46400 1727204605.83181: done dumping result, returning 46400 1727204605.83184: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000001d2d] 46400 1727204605.83192: sending task result for task 0affcd87-79f5-1303-fda8-000000001d2d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204605.83346: no more pending results, returning what we have 46400 1727204605.83351: results queue empty 46400 1727204605.83353: checking for any_errors_fatal 46400 1727204605.83360: done checking for any_errors_fatal 46400 1727204605.83361: checking for max_fail_percentage 46400 1727204605.83363: done checking for max_fail_percentage 46400 1727204605.83367: checking to see if all hosts have failed and the running result is not ok 46400 1727204605.83367: done checking to see if all hosts have failed 46400 1727204605.83368: getting the remaining hosts for this loop 46400 1727204605.83370: done getting the remaining hosts for this loop 46400 1727204605.83373: getting the next task for host managed-node2 46400 1727204605.83382: done getting next task for host managed-node2 46400 1727204605.83387: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204605.83393: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204605.83421: getting variables 46400 1727204605.83422: in VariableManager get_vars() 46400 1727204605.83471: Calling all_inventory to load vars for managed-node2 46400 1727204605.83474: Calling groups_inventory to load vars for managed-node2 46400 1727204605.83477: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204605.83489: Calling all_plugins_play to load vars for managed-node2 46400 1727204605.83492: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204605.83494: Calling groups_plugins_play to load vars for managed-node2 46400 1727204605.84219: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d2d 46400 1727204605.84222: WORKER PROCESS EXITING 46400 1727204605.85033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.86759: done with get_vars() 46400 1727204605.86794: done getting variables 46400 1727204605.86858: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.069) 0:01:36.153 ***** 46400 1727204605.86906: entering _queue_task() for managed-node2/fail 46400 1727204605.87281: worker is 1 (out of 1 available) 46400 1727204605.87298: exiting _queue_task() for managed-node2/fail 46400 1727204605.87312: done queuing things up, now waiting for results queue to drain 46400 1727204605.87314: waiting for pending results... 
46400 1727204605.87623: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204605.87801: in run() - task 0affcd87-79f5-1303-fda8-000000001d2e 46400 1727204605.87822: variable 'ansible_search_path' from source: unknown 46400 1727204605.87830: variable 'ansible_search_path' from source: unknown 46400 1727204605.87882: calling self._execute() 46400 1727204605.88003: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.88016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.88030: variable 'omit' from source: magic vars 46400 1727204605.88454: variable 'ansible_distribution_major_version' from source: facts 46400 1727204605.88474: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204605.89301: variable 'network_state' from source: role '' defaults 46400 1727204605.89324: Evaluated conditional (network_state != {}): False 46400 1727204605.89428: when evaluation is False, skipping this task 46400 1727204605.89436: _execute() done 46400 1727204605.89444: dumping result to json 46400 1727204605.89452: done dumping result, returning 46400 1727204605.89463: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000001d2e] 46400 1727204605.89478: sending task result for task 0affcd87-79f5-1303-fda8-000000001d2e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204605.89633: no more pending results, returning what we have 46400 1727204605.89639: results queue empty 46400 1727204605.89641: checking for any_errors_fatal 46400 1727204605.89649: done checking for any_errors_fatal 46400 1727204605.89651: checking for max_fail_percentage 46400 1727204605.89653: done checking for max_fail_percentage 46400 1727204605.89654: checking to see if all hosts have failed and the running result is not ok 46400 1727204605.89655: done checking to see if all hosts have failed 46400 1727204605.89656: getting the remaining hosts for this loop 46400 1727204605.89657: done getting the remaining hosts for this loop 46400 1727204605.89662: getting the next task for host managed-node2 46400 1727204605.89674: done getting next task for host managed-node2 46400 1727204605.89679: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204605.89687: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204605.89718: getting variables 46400 1727204605.89721: in VariableManager get_vars() 46400 1727204605.89771: Calling all_inventory to load vars for managed-node2 46400 1727204605.89775: Calling groups_inventory to load vars for managed-node2 46400 1727204605.89777: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204605.89791: Calling all_plugins_play to load vars for managed-node2 46400 1727204605.89794: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204605.89798: Calling groups_plugins_play to load vars for managed-node2 46400 1727204605.91289: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d2e 46400 1727204605.91293: WORKER PROCESS EXITING 46400 1727204605.93020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204605.95951: done with get_vars() 46400 1727204605.95990: done getting variables 46400 1727204605.96055: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.097) 0:01:36.251 ***** 46400 1727204605.96670: entering _queue_task() for managed-node2/fail 46400 1727204605.97015: worker is 1 (out of 1 available) 46400 1727204605.97029: exiting _queue_task() for managed-node2/fail 46400 1727204605.97043: done queuing things up, now waiting for results queue to drain 46400 1727204605.97045: waiting for pending results... 
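
The fail task queued above is gated on ansible_distribution_major_version | int > 9, i.e. it only fires on EL10 or later. Restated as plain Python for clarity (an illustrative mirror of the logged condition, not code taken from the role):

def abort_teaming_configuration(distribution_major_version: str) -> bool:
    # True means the role would abort when teaming is requested on EL10 or later.
    return int(distribution_major_version) > 9

print(abort_teaming_configuration("9"))   # False -> task skipped, as in this run
print(abort_teaming_configuration("10"))  # True  -> the fail task would fire
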
46400 1727204605.97367: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204605.97542: in run() - task 0affcd87-79f5-1303-fda8-000000001d2f 46400 1727204605.97568: variable 'ansible_search_path' from source: unknown 46400 1727204605.97576: variable 'ansible_search_path' from source: unknown 46400 1727204605.97618: calling self._execute() 46400 1727204605.97742: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204605.97755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204605.97772: variable 'omit' from source: magic vars 46400 1727204605.98194: variable 'ansible_distribution_major_version' from source: facts 46400 1727204605.98213: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204605.98415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204606.03388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204606.03578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204606.03625: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204606.03665: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204606.03702: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204606.03876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.04054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.04089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.04138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.04242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.04467: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.04488: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204606.04494: when evaluation is False, skipping this task 46400 1727204606.04500: _execute() done 46400 1727204606.04505: dumping result to json 46400 1727204606.04511: done dumping result, returning 46400 1727204606.04520: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000001d2f] 46400 1727204606.04528: sending task result for task 
0affcd87-79f5-1303-fda8-000000001d2f skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204606.04815: no more pending results, returning what we have 46400 1727204606.04820: results queue empty 46400 1727204606.04821: checking for any_errors_fatal 46400 1727204606.04829: done checking for any_errors_fatal 46400 1727204606.04830: checking for max_fail_percentage 46400 1727204606.04833: done checking for max_fail_percentage 46400 1727204606.04834: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.04835: done checking to see if all hosts have failed 46400 1727204606.04836: getting the remaining hosts for this loop 46400 1727204606.04839: done getting the remaining hosts for this loop 46400 1727204606.04844: getting the next task for host managed-node2 46400 1727204606.04855: done getting next task for host managed-node2 46400 1727204606.04859: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204606.04867: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204606.04898: getting variables 46400 1727204606.04900: in VariableManager get_vars() 46400 1727204606.04952: Calling all_inventory to load vars for managed-node2 46400 1727204606.04955: Calling groups_inventory to load vars for managed-node2 46400 1727204606.04958: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.04971: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.04975: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.04978: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.06273: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d2f 46400 1727204606.06276: WORKER PROCESS EXITING 46400 1727204606.08282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.11608: done with get_vars() 46400 1727204606.11649: done getting variables 46400 1727204606.11713: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.150) 0:01:36.402 ***** 46400 1727204606.11750: entering _queue_task() for managed-node2/dnf 46400 1727204606.12099: worker is 1 (out of 1 available) 46400 1727204606.12112: exiting _queue_task() for managed-node2/dnf 46400 1727204606.12124: done queuing things up, now waiting for results queue to drain 46400 1727204606.12125: waiting for pending results... 
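For readers following the role logic rather than the raw executor trace: the task queued above (tasks/main.yml:36) is evaluated a few entries below and skipped, because neither wireless nor team connections are defined in this run. The sketch below is only an illustration of what a task with this name, this module (dnf, as the loaded ActionModule shows), and the false_condition reported in the skip result could look like; it is reconstructed from the log, not copied from the role source, and the list argument is an assumption.

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        # assumed argument: query package availability only, never install
        list: "{{ network_packages }}"
      when: __network_wireless_connections_defined or __network_team_connections_defined
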
46400 1727204606.13082: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204606.13478: in run() - task 0affcd87-79f5-1303-fda8-000000001d30 46400 1727204606.13498: variable 'ansible_search_path' from source: unknown 46400 1727204606.13505: variable 'ansible_search_path' from source: unknown 46400 1727204606.13549: calling self._execute() 46400 1727204606.13750: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.13771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.13882: variable 'omit' from source: magic vars 46400 1727204606.14581: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.14644: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.15040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204606.19128: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204606.19203: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204606.19252: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204606.19294: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204606.19336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204606.19420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.19463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.19498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.19555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.19577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.19704: variable 'ansible_distribution' from source: facts 46400 1727204606.19714: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.19735: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204606.19870: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204606.20006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.20034: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.20063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.20114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.20129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.20173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.20212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.20242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.20286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.20310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.20350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.20380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.20419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.20466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.20486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.20667: variable 'network_connections' from source: include params 46400 1727204606.20685: variable 'interface' from source: play vars 46400 1727204606.20760: variable 'interface' from source: play vars 46400 1727204606.20845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204606.21048: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204606.21099: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204606.21134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204606.21177: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204606.21220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204606.21246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204606.21294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.21326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204606.21379: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204606.21641: variable 'network_connections' from source: include params 46400 1727204606.21651: variable 'interface' from source: play vars 46400 1727204606.21724: variable 'interface' from source: play vars 46400 1727204606.21754: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204606.21762: when evaluation is False, skipping this task 46400 1727204606.21772: _execute() done 46400 1727204606.21779: dumping result to json 46400 1727204606.21786: done dumping result, returning 46400 1727204606.21797: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001d30] 46400 1727204606.21808: sending task result for task 0affcd87-79f5-1303-fda8-000000001d30 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204606.21978: no more pending results, returning what we have 46400 1727204606.21984: results queue empty 46400 1727204606.21985: checking for any_errors_fatal 46400 1727204606.21991: done checking for any_errors_fatal 46400 1727204606.21992: checking for max_fail_percentage 46400 1727204606.21994: done checking for max_fail_percentage 46400 1727204606.21996: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.21996: done checking to see if all hosts have failed 46400 1727204606.21997: getting the remaining hosts for this loop 46400 1727204606.21999: done getting the remaining hosts for this loop 46400 1727204606.22003: getting the next task for host managed-node2 46400 1727204606.22013: done getting next task for host managed-node2 46400 1727204606.22017: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204606.22023: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204606.22050: getting variables 46400 1727204606.22052: in VariableManager get_vars() 46400 1727204606.22097: Calling all_inventory to load vars for managed-node2 46400 1727204606.22099: Calling groups_inventory to load vars for managed-node2 46400 1727204606.22101: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.22111: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.22113: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.22116: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.24772: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d30 46400 1727204606.24776: WORKER PROCESS EXITING 46400 1727204606.25826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.29521: done with get_vars() 46400 1727204606.29557: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204606.29644: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.179) 0:01:36.581 ***** 46400 1727204606.29682: entering _queue_task() for managed-node2/yum 46400 1727204606.30057: worker is 1 (out of 1 available) 46400 1727204606.30072: exiting _queue_task() for managed-node2/yum 46400 1727204606.30090: done queuing things up, now waiting for results queue to drain 46400 1727204606.30092: waiting for pending results... 
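The next task (tasks/main.yml:48) repeats the same update check through the yum action, which current Ansible redirects to dnf as the log notes just above; it only applies to EL systems older than 8, which is why it is skipped below with false_condition ansible_distribution_major_version | int < 8. A hedged, illustrative shape for such a task follows; the list argument mirrors the dnf sketch above and is assumed, not taken from the role.

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        # assumed argument, mirroring the dnf variant
        list: "{{ network_packages }}"
      when: ansible_distribution_major_version | int < 8
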
46400 1727204606.30401: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204606.30578: in run() - task 0affcd87-79f5-1303-fda8-000000001d31 46400 1727204606.30597: variable 'ansible_search_path' from source: unknown 46400 1727204606.30605: variable 'ansible_search_path' from source: unknown 46400 1727204606.30653: calling self._execute() 46400 1727204606.30762: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.30776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.30789: variable 'omit' from source: magic vars 46400 1727204606.31217: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.31231: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.31409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204606.33985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204606.34070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204606.34116: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204606.34165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204606.34197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204606.34288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.34331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.34373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.34420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.34442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.34554: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.34586: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204606.34593: when evaluation is False, skipping this task 46400 1727204606.34599: _execute() done 46400 1727204606.34605: dumping result to json 46400 1727204606.34612: done dumping result, returning 46400 1727204606.34622: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001d31] 46400 
1727204606.34632: sending task result for task 0affcd87-79f5-1303-fda8-000000001d31 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204606.34792: no more pending results, returning what we have 46400 1727204606.34796: results queue empty 46400 1727204606.34797: checking for any_errors_fatal 46400 1727204606.34806: done checking for any_errors_fatal 46400 1727204606.34807: checking for max_fail_percentage 46400 1727204606.34809: done checking for max_fail_percentage 46400 1727204606.34810: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.34811: done checking to see if all hosts have failed 46400 1727204606.34812: getting the remaining hosts for this loop 46400 1727204606.34813: done getting the remaining hosts for this loop 46400 1727204606.34818: getting the next task for host managed-node2 46400 1727204606.34827: done getting next task for host managed-node2 46400 1727204606.34832: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204606.34838: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204606.34868: getting variables 46400 1727204606.34870: in VariableManager get_vars() 46400 1727204606.34917: Calling all_inventory to load vars for managed-node2 46400 1727204606.34919: Calling groups_inventory to load vars for managed-node2 46400 1727204606.34922: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.34933: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.34935: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.34938: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.35989: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d31 46400 1727204606.35993: WORKER PROCESS EXITING 46400 1727204606.36806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.38765: done with get_vars() 46400 1727204606.38789: done getting variables 46400 1727204606.38970: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.093) 0:01:36.674 ***** 46400 1727204606.39010: entering _queue_task() for managed-node2/fail 46400 1727204606.39777: worker is 1 (out of 1 available) 46400 1727204606.39791: exiting _queue_task() for managed-node2/fail 46400 1727204606.39804: done queuing things up, now waiting for results queue to drain 46400 1727204606.39806: waiting for pending results... 
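The task queued here (tasks/main.yml:60) uses the fail action, so when its condition holds it aborts the play to demand the operator's consent before NetworkManager is restarted; in this run it is skipped because no wireless or team connections are defined. A minimal hedged sketch is shown below; the message wording is assumed rather than quoted from the role, and the role's real when clause likely contains additional terms beyond the false_condition reported in the log.

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        # assumed wording, not the role's actual message
        msg: Restarting NetworkManager is required to apply wireless or team configuration, but consent was not given.
      when: __network_wireless_connections_defined or __network_team_connections_defined
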
46400 1727204606.41030: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204606.41206: in run() - task 0affcd87-79f5-1303-fda8-000000001d32 46400 1727204606.41227: variable 'ansible_search_path' from source: unknown 46400 1727204606.41235: variable 'ansible_search_path' from source: unknown 46400 1727204606.41286: calling self._execute() 46400 1727204606.41397: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.41409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.41421: variable 'omit' from source: magic vars 46400 1727204606.41835: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.41854: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.41990: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204606.42209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204606.44728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204606.44805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204606.44849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204606.44893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204606.44922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204606.45010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.45058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.45099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.45143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.45165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.45260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.45295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.45330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.45403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.45428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.45505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.45538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.45670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.45715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.45734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.45956: variable 'network_connections' from source: include params 46400 1727204606.45978: variable 'interface' from source: play vars 46400 1727204606.46055: variable 'interface' from source: play vars 46400 1727204606.46138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204606.46321: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204606.46362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204606.46398: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204606.46438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204606.46486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204606.46511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204606.46547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.46580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204606.46651: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204606.47044: variable 'network_connections' 
from source: include params 46400 1727204606.47061: variable 'interface' from source: play vars 46400 1727204606.47151: variable 'interface' from source: play vars 46400 1727204606.48007: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204606.48014: when evaluation is False, skipping this task 46400 1727204606.48020: _execute() done 46400 1727204606.48026: dumping result to json 46400 1727204606.48033: done dumping result, returning 46400 1727204606.48046: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001d32] 46400 1727204606.48057: sending task result for task 0affcd87-79f5-1303-fda8-000000001d32 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204606.48374: no more pending results, returning what we have 46400 1727204606.48379: results queue empty 46400 1727204606.48380: checking for any_errors_fatal 46400 1727204606.48390: done checking for any_errors_fatal 46400 1727204606.48391: checking for max_fail_percentage 46400 1727204606.48393: done checking for max_fail_percentage 46400 1727204606.48394: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.48395: done checking to see if all hosts have failed 46400 1727204606.48396: getting the remaining hosts for this loop 46400 1727204606.48397: done getting the remaining hosts for this loop 46400 1727204606.48402: getting the next task for host managed-node2 46400 1727204606.48413: done getting next task for host managed-node2 46400 1727204606.48419: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204606.48425: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204606.48456: getting variables 46400 1727204606.48458: in VariableManager get_vars() 46400 1727204606.48509: Calling all_inventory to load vars for managed-node2 46400 1727204606.48513: Calling groups_inventory to load vars for managed-node2 46400 1727204606.48515: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.48527: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.48530: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.48533: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.50331: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d32 46400 1727204606.50335: WORKER PROCESS EXITING 46400 1727204606.51342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.55472: done with get_vars() 46400 1727204606.55509: done getting variables 46400 1727204606.55576: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.166) 0:01:36.840 ***** 46400 1727204606.55615: entering _queue_task() for managed-node2/package 46400 1727204606.56484: worker is 1 (out of 1 available) 46400 1727204606.56499: exiting _queue_task() for managed-node2/package 46400 1727204606.56513: done queuing things up, now waiting for results queue to drain 46400 1727204606.56515: waiting for pending results... 
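Install packages (tasks/main.yml:73) is the first task in this block that could actually change the host. It is skipped further down because every entry of network_packages is already present in ansible_facts.packages (the package inventory normally gathered earlier in the run with ansible.builtin.package_facts), giving the false_condition not network_packages is subset(ansible_facts.packages.keys()). The sketch below is built from those log facts only; the name, module, and when expression come from this run, while the state argument is an assumption.

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        # assumed: ensure the listed packages are present
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
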
46400 1727204606.57454: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204606.57856: in run() - task 0affcd87-79f5-1303-fda8-000000001d33 46400 1727204606.57885: variable 'ansible_search_path' from source: unknown 46400 1727204606.57899: variable 'ansible_search_path' from source: unknown 46400 1727204606.57946: calling self._execute() 46400 1727204606.58228: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.58241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.58282: variable 'omit' from source: magic vars 46400 1727204606.59062: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.59221: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.59675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204606.60070: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204606.60235: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204606.60332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204606.60491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204606.60856: variable 'network_packages' from source: role '' defaults 46400 1727204606.61092: variable '__network_provider_setup' from source: role '' defaults 46400 1727204606.61109: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204606.61299: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204606.61312: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204606.61384: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204606.61695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204606.66461: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204606.66671: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204606.66970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204606.66974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204606.66976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204606.66978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.66981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.67096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.67137: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.67152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.67252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.67277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.67306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.67340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.67353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.67598: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204606.67703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.67729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.67752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.67790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.67802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.67889: variable 'ansible_python' from source: facts 46400 1727204606.67907: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204606.67996: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204606.68081: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204606.68210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.68232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204606.68255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.68302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.68315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.68358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204606.68390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204606.68415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.68456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204606.68472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204606.68622: variable 'network_connections' from source: include params 46400 1727204606.68628: variable 'interface' from source: play vars 46400 1727204606.68734: variable 'interface' from source: play vars 46400 1727204606.69215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204606.69248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204606.69284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204606.69311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204606.69368: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204606.69665: variable 'network_connections' from source: include params 46400 1727204606.69669: variable 'interface' from source: play vars 46400 1727204606.69770: variable 'interface' from source: play vars 46400 1727204606.69808: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204606.69886: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204606.70198: variable 'network_connections' from source: include params 46400 
1727204606.70202: variable 'interface' from source: play vars 46400 1727204606.70271: variable 'interface' from source: play vars 46400 1727204606.70297: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204606.70377: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204606.71519: variable 'network_connections' from source: include params 46400 1727204606.71524: variable 'interface' from source: play vars 46400 1727204606.71588: variable 'interface' from source: play vars 46400 1727204606.71754: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204606.71813: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204606.71819: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204606.71992: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204606.72424: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204606.73495: variable 'network_connections' from source: include params 46400 1727204606.73499: variable 'interface' from source: play vars 46400 1727204606.73561: variable 'interface' from source: play vars 46400 1727204606.73570: variable 'ansible_distribution' from source: facts 46400 1727204606.73688: variable '__network_rh_distros' from source: role '' defaults 46400 1727204606.73694: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.73709: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204606.73979: variable 'ansible_distribution' from source: facts 46400 1727204606.73983: variable '__network_rh_distros' from source: role '' defaults 46400 1727204606.73988: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.74005: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204606.74386: variable 'ansible_distribution' from source: facts 46400 1727204606.74390: variable '__network_rh_distros' from source: role '' defaults 46400 1727204606.74395: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.74439: variable 'network_provider' from source: set_fact 46400 1727204606.74566: variable 'ansible_facts' from source: unknown 46400 1727204606.76083: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204606.76087: when evaluation is False, skipping this task 46400 1727204606.76090: _execute() done 46400 1727204606.76092: dumping result to json 46400 1727204606.76094: done dumping result, returning 46400 1727204606.76101: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000001d33] 46400 1727204606.76108: sending task result for task 0affcd87-79f5-1303-fda8-000000001d33 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204606.76270: no more pending results, returning what we have 46400 1727204606.76274: results queue empty 46400 1727204606.76275: checking for any_errors_fatal 46400 1727204606.76282: done checking for any_errors_fatal 46400 1727204606.76283: checking for max_fail_percentage 46400 1727204606.76285: done checking for max_fail_percentage 46400 
1727204606.76286: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.76287: done checking to see if all hosts have failed 46400 1727204606.76287: getting the remaining hosts for this loop 46400 1727204606.76289: done getting the remaining hosts for this loop 46400 1727204606.76293: getting the next task for host managed-node2 46400 1727204606.76303: done getting next task for host managed-node2 46400 1727204606.76307: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204606.76312: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204606.76341: getting variables 46400 1727204606.76343: in VariableManager get_vars() 46400 1727204606.76391: Calling all_inventory to load vars for managed-node2 46400 1727204606.76394: Calling groups_inventory to load vars for managed-node2 46400 1727204606.76400: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d33 46400 1727204606.76406: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.76412: WORKER PROCESS EXITING 46400 1727204606.76426: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.76429: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.76432: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.79598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.83000: done with get_vars() 46400 1727204606.83039: done getting variables 46400 1727204606.83108: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.275) 0:01:37.116 ***** 46400 1727204606.83148: entering _queue_task() for managed-node2/package 46400 1727204606.83507: worker is 1 (out of 1 available) 46400 1727204606.83520: exiting _queue_task() for managed-node2/package 46400 
1727204606.83532: done queuing things up, now waiting for results queue to drain 46400 1727204606.83534: waiting for pending results... 46400 1727204606.84685: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204606.84983: in run() - task 0affcd87-79f5-1303-fda8-000000001d34 46400 1727204606.85090: variable 'ansible_search_path' from source: unknown 46400 1727204606.85098: variable 'ansible_search_path' from source: unknown 46400 1727204606.85187: calling self._execute() 46400 1727204606.85492: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.85505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.85574: variable 'omit' from source: magic vars 46400 1727204606.86671: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.86695: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.86974: variable 'network_state' from source: role '' defaults 46400 1727204606.86993: Evaluated conditional (network_state != {}): False 46400 1727204606.87021: when evaluation is False, skipping this task 46400 1727204606.87028: _execute() done 46400 1727204606.87035: dumping result to json 46400 1727204606.87115: done dumping result, returning 46400 1727204606.87132: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001d34] 46400 1727204606.87153: sending task result for task 0affcd87-79f5-1303-fda8-000000001d34 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204606.87324: no more pending results, returning what we have 46400 1727204606.87328: results queue empty 46400 1727204606.87329: checking for any_errors_fatal 46400 1727204606.87336: done checking for any_errors_fatal 46400 1727204606.87337: checking for max_fail_percentage 46400 1727204606.87338: done checking for max_fail_percentage 46400 1727204606.87339: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.87340: done checking to see if all hosts have failed 46400 1727204606.87341: getting the remaining hosts for this loop 46400 1727204606.87342: done getting the remaining hosts for this loop 46400 1727204606.87346: getting the next task for host managed-node2 46400 1727204606.87356: done getting next task for host managed-node2 46400 1727204606.87360: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204606.87370: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204606.87390: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d34 46400 1727204606.87394: WORKER PROCESS EXITING 46400 1727204606.87416: getting variables 46400 1727204606.87418: in VariableManager get_vars() 46400 1727204606.87473: Calling all_inventory to load vars for managed-node2 46400 1727204606.87477: Calling groups_inventory to load vars for managed-node2 46400 1727204606.87479: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.87494: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.87501: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.87503: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.89251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.90193: done with get_vars() 46400 1727204606.90211: done getting variables 46400 1727204606.90257: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.071) 0:01:37.187 ***** 46400 1727204606.90286: entering _queue_task() for managed-node2/package 46400 1727204606.90572: worker is 1 (out of 1 available) 46400 1727204606.90631: exiting _queue_task() for managed-node2/package 46400 1727204606.90677: done queuing things up, now waiting for results queue to drain 46400 1727204606.90703: waiting for pending results... 
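The task queued here is gated the same way: both nmstate-related install tasks only run when the caller supplies a non-empty network_state. A hedged, hypothetical example of a play that would exercise that path (the interface name and IP settings below are illustrative and not taken from this run):

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true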
46400 1727204606.91065: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204606.91221: in run() - task 0affcd87-79f5-1303-fda8-000000001d35 46400 1727204606.91245: variable 'ansible_search_path' from source: unknown 46400 1727204606.91257: variable 'ansible_search_path' from source: unknown 46400 1727204606.91298: calling self._execute() 46400 1727204606.91412: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.91428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.91456: variable 'omit' from source: magic vars 46400 1727204606.92149: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.92169: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.92474: variable 'network_state' from source: role '' defaults 46400 1727204606.92524: Evaluated conditional (network_state != {}): False 46400 1727204606.92528: when evaluation is False, skipping this task 46400 1727204606.92530: _execute() done 46400 1727204606.92533: dumping result to json 46400 1727204606.92535: done dumping result, returning 46400 1727204606.92538: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000001d35] 46400 1727204606.92540: sending task result for task 0affcd87-79f5-1303-fda8-000000001d35 46400 1727204606.92658: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d35 46400 1727204606.92661: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204606.92732: no more pending results, returning what we have 46400 1727204606.92736: results queue empty 46400 1727204606.92737: checking for any_errors_fatal 46400 1727204606.92743: done checking for any_errors_fatal 46400 1727204606.92744: checking for max_fail_percentage 46400 1727204606.92746: done checking for max_fail_percentage 46400 1727204606.92747: checking to see if all hosts have failed and the running result is not ok 46400 1727204606.92747: done checking to see if all hosts have failed 46400 1727204606.92748: getting the remaining hosts for this loop 46400 1727204606.92750: done getting the remaining hosts for this loop 46400 1727204606.92754: getting the next task for host managed-node2 46400 1727204606.92762: done getting next task for host managed-node2 46400 1727204606.92767: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204606.92773: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204606.92795: getting variables 46400 1727204606.92796: in VariableManager get_vars() 46400 1727204606.92833: Calling all_inventory to load vars for managed-node2 46400 1727204606.92836: Calling groups_inventory to load vars for managed-node2 46400 1727204606.92839: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204606.92848: Calling all_plugins_play to load vars for managed-node2 46400 1727204606.92851: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204606.92853: Calling groups_plugins_play to load vars for managed-node2 46400 1727204606.93818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204606.96245: done with get_vars() 46400 1727204606.96324: done getting variables 46400 1727204606.96395: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.061) 0:01:37.248 ***** 46400 1727204606.96442: entering _queue_task() for managed-node2/service 46400 1727204606.96802: worker is 1 (out of 1 available) 46400 1727204606.96816: exiting _queue_task() for managed-node2/service 46400 1727204606.96827: done queuing things up, now waiting for results queue to drain 46400 1727204606.96829: waiting for pending results... 
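The service task queued here restarts NetworkManager only when the requested connections include wireless or team interfaces; with the connection profile used in this run, the conditional evaluates False and the restart is skipped in the output that follows. A rough sketch of the task shape, assuming the service name (the conditional variables are exactly the ones shown in the skip below):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined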
46400 1727204606.97218: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204606.97346: in run() - task 0affcd87-79f5-1303-fda8-000000001d36 46400 1727204606.97357: variable 'ansible_search_path' from source: unknown 46400 1727204606.97366: variable 'ansible_search_path' from source: unknown 46400 1727204606.97394: calling self._execute() 46400 1727204606.97479: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204606.97484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204606.97492: variable 'omit' from source: magic vars 46400 1727204606.97776: variable 'ansible_distribution_major_version' from source: facts 46400 1727204606.97786: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204606.97874: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204606.98032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204607.01043: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204607.01102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204607.01138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204607.01177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204607.01221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204607.01300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.01343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.01369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.01414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.01428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.01472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.01493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.01517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204607.01554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.01569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.01609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.01635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.01832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.01836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.01838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.01895: variable 'network_connections' from source: include params 46400 1727204607.01908: variable 'interface' from source: play vars 46400 1727204607.01986: variable 'interface' from source: play vars 46400 1727204607.02088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204607.02479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204607.02548: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204607.02578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204607.02623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204607.02671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204607.02694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204607.02719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.02746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204607.02800: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204607.03034: variable 'network_connections' from source: include params 46400 1727204607.03038: variable 'interface' 
from source: play vars 46400 1727204607.03100: variable 'interface' from source: play vars 46400 1727204607.03123: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204607.03127: when evaluation is False, skipping this task 46400 1727204607.03130: _execute() done 46400 1727204607.03132: dumping result to json 46400 1727204607.03134: done dumping result, returning 46400 1727204607.03141: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000001d36] 46400 1727204607.03148: sending task result for task 0affcd87-79f5-1303-fda8-000000001d36 46400 1727204607.03250: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d36 46400 1727204607.03258: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204607.03307: no more pending results, returning what we have 46400 1727204607.03311: results queue empty 46400 1727204607.03312: checking for any_errors_fatal 46400 1727204607.03318: done checking for any_errors_fatal 46400 1727204607.03319: checking for max_fail_percentage 46400 1727204607.03321: done checking for max_fail_percentage 46400 1727204607.03322: checking to see if all hosts have failed and the running result is not ok 46400 1727204607.03323: done checking to see if all hosts have failed 46400 1727204607.03324: getting the remaining hosts for this loop 46400 1727204607.03325: done getting the remaining hosts for this loop 46400 1727204607.03329: getting the next task for host managed-node2 46400 1727204607.03339: done getting next task for host managed-node2 46400 1727204607.03343: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204607.03348: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204607.03377: getting variables 46400 1727204607.03379: in VariableManager get_vars() 46400 1727204607.03420: Calling all_inventory to load vars for managed-node2 46400 1727204607.03422: Calling groups_inventory to load vars for managed-node2 46400 1727204607.03424: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204607.03434: Calling all_plugins_play to load vars for managed-node2 46400 1727204607.03436: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204607.03438: Calling groups_plugins_play to load vars for managed-node2 46400 1727204607.04871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204607.06696: done with get_vars() 46400 1727204607.06717: done getting variables 46400 1727204607.06760: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.103) 0:01:37.352 ***** 46400 1727204607.06793: entering _queue_task() for managed-node2/service 46400 1727204607.07028: worker is 1 (out of 1 available) 46400 1727204607.07042: exiting _queue_task() for managed-node2/service 46400 1727204607.07054: done queuing things up, now waiting for results queue to drain 46400 1727204607.07056: waiting for pending results... 46400 1727204607.07247: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204607.07367: in run() - task 0affcd87-79f5-1303-fda8-000000001d37 46400 1727204607.07378: variable 'ansible_search_path' from source: unknown 46400 1727204607.07382: variable 'ansible_search_path' from source: unknown 46400 1727204607.07410: calling self._execute() 46400 1727204607.07495: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.07499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.07508: variable 'omit' from source: magic vars 46400 1727204607.07789: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.07799: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204607.07911: variable 'network_provider' from source: set_fact 46400 1727204607.07915: variable 'network_state' from source: role '' defaults 46400 1727204607.07924: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204607.07929: variable 'omit' from source: magic vars 46400 1727204607.07973: variable 'omit' from source: magic vars 46400 1727204607.07994: variable 'network_service_name' from source: role '' defaults 46400 1727204607.08043: variable 'network_service_name' from source: role '' defaults 46400 1727204607.08116: variable '__network_provider_setup' from source: role '' defaults 46400 1727204607.08120: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204607.08166: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204607.08174: variable '__network_packages_default_nm' from source: role '' defaults 
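Unlike the three skipped tasks above, this one's conditional (network_provider == "nm" or network_state != {}) evaluates True, so the service action actually runs: it resolves network_service_name, reuses the multiplexed ssh connection, and ships the cached systemd module (AnsiballZ_systemd.py) to the node for execution with python3.9. Judging from the module_args echoed back in the result later in this task ("name": "NetworkManager", "state": "started", "enabled": true), the task corresponds roughly to the following sketch (on this systemd host the generic service module dispatches to ansible.modules.systemd):

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}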
46400 1727204607.08221: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204607.08370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204607.10196: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204607.10240: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204607.10268: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204607.10296: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204607.10316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204607.10377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.10409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.10427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.10455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.10467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.10500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.10518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.10534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.10565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.10574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.10718: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204607.10797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.10814: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.10834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.10862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.10881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.10940: variable 'ansible_python' from source: facts 46400 1727204607.10953: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204607.11012: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204607.11068: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204607.11150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.11170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.11188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.11214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.11224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.11261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.11280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.11297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.11323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.11334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.11427: variable 'network_connections' from 
source: include params 46400 1727204607.11434: variable 'interface' from source: play vars 46400 1727204607.11492: variable 'interface' from source: play vars 46400 1727204607.11566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204607.11689: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204607.11727: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204607.11758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204607.11790: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204607.11835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204607.11856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204607.11882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.11904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204607.11944: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204607.12132: variable 'network_connections' from source: include params 46400 1727204607.12142: variable 'interface' from source: play vars 46400 1727204607.12193: variable 'interface' from source: play vars 46400 1727204607.12217: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204607.12275: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204607.12458: variable 'network_connections' from source: include params 46400 1727204607.12469: variable 'interface' from source: play vars 46400 1727204607.12514: variable 'interface' from source: play vars 46400 1727204607.12531: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204607.12588: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204607.12771: variable 'network_connections' from source: include params 46400 1727204607.12774: variable 'interface' from source: play vars 46400 1727204607.12826: variable 'interface' from source: play vars 46400 1727204607.12867: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204607.12909: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204607.12913: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204607.12956: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204607.13093: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204607.13402: variable 'network_connections' from source: include params 46400 1727204607.13405: variable 'interface' from source: play vars 46400 1727204607.13451: variable 'interface' from 
source: play vars 46400 1727204607.13455: variable 'ansible_distribution' from source: facts 46400 1727204607.13457: variable '__network_rh_distros' from source: role '' defaults 46400 1727204607.13466: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.13477: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204607.13590: variable 'ansible_distribution' from source: facts 46400 1727204607.13593: variable '__network_rh_distros' from source: role '' defaults 46400 1727204607.13598: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.13608: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204607.13722: variable 'ansible_distribution' from source: facts 46400 1727204607.13726: variable '__network_rh_distros' from source: role '' defaults 46400 1727204607.13729: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.13754: variable 'network_provider' from source: set_fact 46400 1727204607.13778: variable 'omit' from source: magic vars 46400 1727204607.13799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204607.13820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204607.13834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204607.13847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204607.13855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204607.13880: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204607.13883: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.13890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.13950: Set connection var ansible_shell_type to sh 46400 1727204607.13958: Set connection var ansible_shell_executable to /bin/sh 46400 1727204607.13965: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204607.13970: Set connection var ansible_connection to ssh 46400 1727204607.13975: Set connection var ansible_pipelining to False 46400 1727204607.13979: Set connection var ansible_timeout to 10 46400 1727204607.14002: variable 'ansible_shell_executable' from source: unknown 46400 1727204607.14005: variable 'ansible_connection' from source: unknown 46400 1727204607.14009: variable 'ansible_module_compression' from source: unknown 46400 1727204607.14011: variable 'ansible_shell_type' from source: unknown 46400 1727204607.14013: variable 'ansible_shell_executable' from source: unknown 46400 1727204607.14015: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.14018: variable 'ansible_pipelining' from source: unknown 46400 1727204607.14022: variable 'ansible_timeout' from source: unknown 46400 1727204607.14024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.14105: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204607.14114: variable 'omit' from source: magic vars 46400 1727204607.14117: starting attempt loop 46400 1727204607.14119: running the handler 46400 1727204607.14177: variable 'ansible_facts' from source: unknown 46400 1727204607.14614: _low_level_execute_command(): starting 46400 1727204607.14620: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204607.15144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.15153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.15185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.15198: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204607.15210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.15263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204607.15276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.15340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.16996: stdout chunk (state=3): >>>/root <<< 46400 1727204607.17103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204607.17163: stderr chunk (state=3): >>><<< 46400 1727204607.17174: stdout chunk (state=3): >>><<< 46400 1727204607.17193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 46400 1727204607.17204: _low_level_execute_command(): starting 46400 1727204607.17210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948 `" && echo ansible-tmp-1727204607.1719391-53113-135458974999948="` echo /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948 `" ) && sleep 0' 46400 1727204607.17681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.17698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.17714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204607.17726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.17736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.17787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204607.17796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.17856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.19741: stdout chunk (state=3): >>>ansible-tmp-1727204607.1719391-53113-135458974999948=/root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948 <<< 46400 1727204607.19862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204607.19919: stderr chunk (state=3): >>><<< 46400 1727204607.19922: stdout chunk (state=3): >>><<< 46400 1727204607.19935: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204607.1719391-53113-135458974999948=/root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204607.19967: variable 'ansible_module_compression' from source: unknown 46400 1727204607.20008: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204607.20060: variable 'ansible_facts' from source: unknown 46400 1727204607.20199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/AnsiballZ_systemd.py 46400 1727204607.20397: Sending initial data 46400 1727204607.20400: Sent initial data (156 bytes) 46400 1727204607.21439: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204607.21489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.21527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.21541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.21549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.21597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204607.21600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204607.21611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.21669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.23408: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204607.23449: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204607.23525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpp4dqc8qt /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/AnsiballZ_systemd.py <<< 46400 1727204607.24669: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204607.25811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 
1727204607.26071: stderr chunk (state=3): >>><<< 46400 1727204607.26074: stdout chunk (state=3): >>><<< 46400 1727204607.26077: done transferring module to remote 46400 1727204607.26079: _low_level_execute_command(): starting 46400 1727204607.26082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/ /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/AnsiballZ_systemd.py && sleep 0' 46400 1727204607.26672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204607.26687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.26702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.26719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.26761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.26775: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204607.26789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.26805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204607.26817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204607.26829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204607.26845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.26860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.26892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.26906: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204607.26921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.26993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204607.27012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204607.27027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.27112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.28931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204607.29013: stderr chunk (state=3): >>><<< 46400 1727204607.29016: stdout chunk (state=3): >>><<< 46400 1727204607.29036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204607.29039: _low_level_execute_command(): starting 46400 1727204607.29044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/AnsiballZ_systemd.py && sleep 0' 46400 1727204607.29711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204607.29719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.29729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.29742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.29787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.29795: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204607.29804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.29818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204607.29827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204607.29830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204607.29838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.29847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.29858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.29872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.29881: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204607.29887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.29959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204607.29977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204607.29987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.30077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.55639: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204607.55662: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2225837000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", 
"LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", 
"InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204607.57314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204607.57399: stderr chunk (state=3): >>><<< 46400 1727204607.57403: stdout chunk (state=3): >>><<< 46400 1727204607.57571: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6938624", "MemoryAvailable": "infinity", "CPUUsageNSec": "2225837000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": 
"2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204607.57718: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204607.57722: _low_level_execute_command(): starting 46400 1727204607.57725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204607.1719391-53113-135458974999948/ > /dev/null 2>&1 && sleep 0' 46400 1727204607.58788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204607.59214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.59231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.59253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.59301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204607.59315: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204607.59332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.59351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204607.59368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204607.59380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204607.59391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204607.59404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204607.59421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204607.59433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 46400 1727204607.59447: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204607.59487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204607.59567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204607.59676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204607.59694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204607.59774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204607.61683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204607.61723: stderr chunk (state=3): >>><<< 46400 1727204607.61726: stdout chunk (state=3): >>><<< 46400 1727204607.61870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204607.61874: handler run complete 46400 1727204607.61876: attempt loop complete, returning result 46400 1727204607.61878: _execute() done 46400 1727204607.61880: dumping result to json 46400 1727204607.61882: done dumping result, returning 46400 1727204607.61884: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000001d37] 46400 1727204607.61886: sending task result for task 0affcd87-79f5-1303-fda8-000000001d37 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204607.62172: no more pending results, returning what we have 46400 1727204607.62176: results queue empty 46400 1727204607.62177: checking for any_errors_fatal 46400 1727204607.62184: done checking for any_errors_fatal 46400 1727204607.62185: checking for max_fail_percentage 46400 1727204607.62187: done checking for max_fail_percentage 46400 1727204607.62188: checking to see if all hosts have failed and the running result is not ok 46400 1727204607.62189: done checking to see if all hosts have failed 46400 1727204607.62189: getting the remaining hosts for this loop 46400 1727204607.62191: done getting the remaining hosts for this loop 46400 1727204607.62194: getting the next task for host managed-node2 46400 1727204607.62203: done getting next task for host managed-node2 46400 1727204607.62207: ^ 
task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204607.62213: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204607.62230: getting variables 46400 1727204607.62232: in VariableManager get_vars() 46400 1727204607.62275: Calling all_inventory to load vars for managed-node2 46400 1727204607.62277: Calling groups_inventory to load vars for managed-node2 46400 1727204607.62280: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204607.62290: Calling all_plugins_play to load vars for managed-node2 46400 1727204607.62293: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204607.62296: Calling groups_plugins_play to load vars for managed-node2 46400 1727204607.63243: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d37 46400 1727204607.63250: WORKER PROCESS EXITING 46400 1727204607.64402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204607.67178: done with get_vars() 46400 1727204607.67209: done getting variables 46400 1727204607.67284: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.605) 0:01:37.957 ***** 46400 1727204607.67322: entering _queue_task() for managed-node2/service 46400 1727204607.67708: worker is 1 (out of 1 available) 46400 1727204607.67721: exiting _queue_task() for managed-node2/service 46400 1727204607.67734: done queuing things up, now waiting for results queue to drain 46400 1727204607.67735: waiting for pending results... 
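For readers reconstructing the play from this trace: the invocation block captured above ("module_args": {"name": "NetworkManager", "state": "started", "enabled": true, ...}) is what the "Enable and start NetworkManager" task fed to the systemd module (AnsiballZ_systemd.py). A minimal sketch of an equivalent task follows; the YAML shape is inferred only from those module_args and from the no_log censoring visible in the result, not from the role's actual source, which may declare the action through the generic service module and carry extra parameters.

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # matches the "censored ... 'no_log: true'" marker in the task result above
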
46400 1727204607.68050: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204607.68320: in run() - task 0affcd87-79f5-1303-fda8-000000001d38 46400 1727204607.68342: variable 'ansible_search_path' from source: unknown 46400 1727204607.68349: variable 'ansible_search_path' from source: unknown 46400 1727204607.68395: calling self._execute() 46400 1727204607.68513: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.68525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.68542: variable 'omit' from source: magic vars 46400 1727204607.68960: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.68979: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204607.69111: variable 'network_provider' from source: set_fact 46400 1727204607.69124: Evaluated conditional (network_provider == "nm"): True 46400 1727204607.69231: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204607.69332: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204607.69633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204607.72353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204607.72427: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204607.72478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204607.72518: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204607.72554: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204607.72652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.72691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.72722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.72776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.72795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.72844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.72876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204607.72910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.72952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.72971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.73024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204607.73050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204607.73080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.73129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204607.73147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204607.73308: variable 'network_connections' from source: include params 46400 1727204607.73333: variable 'interface' from source: play vars 46400 1727204607.73408: variable 'interface' from source: play vars 46400 1727204607.73481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204607.73634: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204607.73689: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204607.73727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204607.73772: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204607.73850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204607.73884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204607.74055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204607.74091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204607.74155: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204607.74529: variable 'network_connections' from source: include params 46400 1727204607.74583: variable 'interface' from source: play vars 46400 1727204607.74651: variable 'interface' from source: play vars 46400 1727204607.74826: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204607.74834: when evaluation is False, skipping this task 46400 1727204607.74843: _execute() done 46400 1727204607.74851: dumping result to json 46400 1727204607.74858: done dumping result, returning 46400 1727204607.74872: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000001d38] 46400 1727204607.74894: sending task result for task 0affcd87-79f5-1303-fda8-000000001d38 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204607.75065: no more pending results, returning what we have 46400 1727204607.75072: results queue empty 46400 1727204607.75074: checking for any_errors_fatal 46400 1727204607.75093: done checking for any_errors_fatal 46400 1727204607.75094: checking for max_fail_percentage 46400 1727204607.75096: done checking for max_fail_percentage 46400 1727204607.75097: checking to see if all hosts have failed and the running result is not ok 46400 1727204607.75098: done checking to see if all hosts have failed 46400 1727204607.75098: getting the remaining hosts for this loop 46400 1727204607.75100: done getting the remaining hosts for this loop 46400 1727204607.75105: getting the next task for host managed-node2 46400 1727204607.75115: done getting next task for host managed-node2 46400 1727204607.75120: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204607.75128: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204607.75156: getting variables 46400 1727204607.75158: in VariableManager get_vars() 46400 1727204607.75208: Calling all_inventory to load vars for managed-node2 46400 1727204607.75211: Calling groups_inventory to load vars for managed-node2 46400 1727204607.75214: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204607.75226: Calling all_plugins_play to load vars for managed-node2 46400 1727204607.75229: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204607.75232: Calling groups_plugins_play to load vars for managed-node2 46400 1727204607.76570: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d38 46400 1727204607.76575: WORKER PROCESS EXITING 46400 1727204607.78047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204607.79806: done with get_vars() 46400 1727204607.79842: done getting variables 46400 1727204607.79910: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.126) 0:01:38.084 ***** 46400 1727204607.79953: entering _queue_task() for managed-node2/service 46400 1727204607.80336: worker is 1 (out of 1 available) 46400 1727204607.80350: exiting _queue_task() for managed-node2/service 46400 1727204607.80372: done queuing things up, now waiting for results queue to drain 46400 1727204607.80374: waiting for pending results... 
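The skip recorded above never reaches a remote host: the task is discarded locally once its conditional fails, which is why no SSH traffic appears between the two task headers. Based on the conditionals the trace shows being evaluated (network_provider == "nm" true, __network_wpa_supplicant_required false), the gating pattern looks roughly like the sketch below; the module and service name are assumptions, since the role source is not part of this log.

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:     # assumed; the trace only shows the 'service' action plugin being loaded
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool   # evaluates to false for this run, so the task is skipped
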
46400 1727204607.80692: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204607.80868: in run() - task 0affcd87-79f5-1303-fda8-000000001d39 46400 1727204607.80891: variable 'ansible_search_path' from source: unknown 46400 1727204607.80899: variable 'ansible_search_path' from source: unknown 46400 1727204607.80946: calling self._execute() 46400 1727204607.81066: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.81079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.81093: variable 'omit' from source: magic vars 46400 1727204607.81515: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.81533: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204607.81673: variable 'network_provider' from source: set_fact 46400 1727204607.81691: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204607.81698: when evaluation is False, skipping this task 46400 1727204607.81704: _execute() done 46400 1727204607.81711: dumping result to json 46400 1727204607.81718: done dumping result, returning 46400 1727204607.81727: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000001d39] 46400 1727204607.81737: sending task result for task 0affcd87-79f5-1303-fda8-000000001d39 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204607.81900: no more pending results, returning what we have 46400 1727204607.81905: results queue empty 46400 1727204607.81906: checking for any_errors_fatal 46400 1727204607.81915: done checking for any_errors_fatal 46400 1727204607.81916: checking for max_fail_percentage 46400 1727204607.81918: done checking for max_fail_percentage 46400 1727204607.81919: checking to see if all hosts have failed and the running result is not ok 46400 1727204607.81920: done checking to see if all hosts have failed 46400 1727204607.81921: getting the remaining hosts for this loop 46400 1727204607.81923: done getting the remaining hosts for this loop 46400 1727204607.81927: getting the next task for host managed-node2 46400 1727204607.81938: done getting next task for host managed-node2 46400 1727204607.81943: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204607.81949: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204607.81987: getting variables 46400 1727204607.81989: in VariableManager get_vars() 46400 1727204607.82040: Calling all_inventory to load vars for managed-node2 46400 1727204607.82043: Calling groups_inventory to load vars for managed-node2 46400 1727204607.82046: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204607.82063: Calling all_plugins_play to load vars for managed-node2 46400 1727204607.82068: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204607.82071: Calling groups_plugins_play to load vars for managed-node2 46400 1727204607.83041: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d39 46400 1727204607.83045: WORKER PROCESS EXITING 46400 1727204607.83992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204607.85773: done with get_vars() 46400 1727204607.85805: done getting variables 46400 1727204607.85879: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.059) 0:01:38.143 ***** 46400 1727204607.85918: entering _queue_task() for managed-node2/copy 46400 1727204607.86302: worker is 1 (out of 1 available) 46400 1727204607.86315: exiting _queue_task() for managed-node2/copy 46400 1727204607.86328: done queuing things up, now waiting for results queue to drain 46400 1727204607.86330: waiting for pending results... 
46400 1727204607.86663: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204607.86835: in run() - task 0affcd87-79f5-1303-fda8-000000001d3a 46400 1727204607.86862: variable 'ansible_search_path' from source: unknown 46400 1727204607.86875: variable 'ansible_search_path' from source: unknown 46400 1727204607.86923: calling self._execute() 46400 1727204607.87044: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.87058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.87080: variable 'omit' from source: magic vars 46400 1727204607.87499: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.87517: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204607.87656: variable 'network_provider' from source: set_fact 46400 1727204607.87672: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204607.87680: when evaluation is False, skipping this task 46400 1727204607.87686: _execute() done 46400 1727204607.87693: dumping result to json 46400 1727204607.87701: done dumping result, returning 46400 1727204607.87716: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000001d3a] 46400 1727204607.87728: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3a 46400 1727204607.87858: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3a 46400 1727204607.87865: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204607.87912: no more pending results, returning what we have 46400 1727204607.87916: results queue empty 46400 1727204607.87917: checking for any_errors_fatal 46400 1727204607.87926: done checking for any_errors_fatal 46400 1727204607.87927: checking for max_fail_percentage 46400 1727204607.87929: done checking for max_fail_percentage 46400 1727204607.87930: checking to see if all hosts have failed and the running result is not ok 46400 1727204607.87931: done checking to see if all hosts have failed 46400 1727204607.87931: getting the remaining hosts for this loop 46400 1727204607.87933: done getting the remaining hosts for this loop 46400 1727204607.87937: getting the next task for host managed-node2 46400 1727204607.87947: done getting next task for host managed-node2 46400 1727204607.87953: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204607.87958: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204607.87995: getting variables 46400 1727204607.87997: in VariableManager get_vars() 46400 1727204607.88044: Calling all_inventory to load vars for managed-node2 46400 1727204607.88047: Calling groups_inventory to load vars for managed-node2 46400 1727204607.88050: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204607.88065: Calling all_plugins_play to load vars for managed-node2 46400 1727204607.88068: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204607.88071: Calling groups_plugins_play to load vars for managed-node2 46400 1727204607.95615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204607.97600: done with get_vars() 46400 1727204607.97634: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.118) 0:01:38.261 ***** 46400 1727204607.97721: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204607.98098: worker is 1 (out of 1 available) 46400 1727204607.98115: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204607.98127: done queuing things up, now waiting for results queue to drain 46400 1727204607.98129: waiting for pending results... 
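The task queued next dispatches the collection's own network_connections module (the AnsiballZ_network_connections.py transfer appears a few entries below). The trace only exposes the variable wiring, the network_connections list coming from include params with interface drawn from play vars, plus the ansible_managed header rendered from get_ansible_managed.j2; the profile contents themselves are not visible in this excerpt. The sketch below is therefore hypothetical in everything except that wiring: the interface value and the profile body are placeholders, not values taken from this run.

    vars:
      interface: example0              # placeholder; the real value comes from play vars and is not shown here
      network_connections:
        - name: "{{ interface }}"      # hypothetical minimal profile; actual settings are not visible in this excerpt
          state: up

A role task feeding a list like this to fedora.linux_system_roles.network_connections is what produces the module execution that follows.
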
46400 1727204607.98450: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204607.98647: in run() - task 0affcd87-79f5-1303-fda8-000000001d3b 46400 1727204607.98677: variable 'ansible_search_path' from source: unknown 46400 1727204607.98691: variable 'ansible_search_path' from source: unknown 46400 1727204607.98735: calling self._execute() 46400 1727204607.98876: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204607.98894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204607.98915: variable 'omit' from source: magic vars 46400 1727204607.99391: variable 'ansible_distribution_major_version' from source: facts 46400 1727204607.99415: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204607.99427: variable 'omit' from source: magic vars 46400 1727204607.99506: variable 'omit' from source: magic vars 46400 1727204607.99680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204608.02187: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204608.02282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204608.02327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204608.02373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204608.02407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204608.02502: variable 'network_provider' from source: set_fact 46400 1727204608.02646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204608.02685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204608.02711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204608.02748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204608.02770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204608.02844: variable 'omit' from source: magic vars 46400 1727204608.02976: variable 'omit' from source: magic vars 46400 1727204608.03094: variable 'network_connections' from source: include params 46400 1727204608.03115: variable 'interface' from source: play vars 46400 1727204608.03186: variable 'interface' from source: play vars 46400 1727204608.03382: variable 'omit' from source: magic vars 46400 1727204608.03395: variable '__lsr_ansible_managed' from source: task vars 46400 1727204608.03462: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204608.03693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204608.04137: Loaded config def from plugin (lookup/template) 46400 1727204608.04148: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204608.04256: File lookup term: get_ansible_managed.j2 46400 1727204608.04269: variable 'ansible_search_path' from source: unknown 46400 1727204608.04379: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204608.04489: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204608.04512: variable 'ansible_search_path' from source: unknown 46400 1727204608.11685: variable 'ansible_managed' from source: unknown 46400 1727204608.11839: variable 'omit' from source: magic vars 46400 1727204608.11878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204608.11907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204608.11930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204608.11958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.11980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.12011: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204608.12020: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.12029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.12128: Set connection var ansible_shell_type to sh 46400 1727204608.12142: Set connection var ansible_shell_executable to /bin/sh 46400 1727204608.12152: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204608.12167: Set connection var ansible_connection to ssh 46400 1727204608.12180: Set connection var ansible_pipelining to False 46400 1727204608.12191: Set connection var ansible_timeout to 10 46400 1727204608.12220: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.12228: variable 'ansible_connection' from source: unknown 46400 1727204608.12236: variable 'ansible_module_compression' 
from source: unknown 46400 1727204608.12242: variable 'ansible_shell_type' from source: unknown 46400 1727204608.12248: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.12254: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.12262: variable 'ansible_pipelining' from source: unknown 46400 1727204608.12270: variable 'ansible_timeout' from source: unknown 46400 1727204608.12279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.12395: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204608.12416: variable 'omit' from source: magic vars 46400 1727204608.12425: starting attempt loop 46400 1727204608.12430: running the handler 46400 1727204608.12446: _low_level_execute_command(): starting 46400 1727204608.12455: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204608.13326: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204608.13343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.13363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.13390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.13435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.13448: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204608.13468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.13492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204608.13505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204608.13524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204608.13548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.13571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.13591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.13610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.13622: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204608.13637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.13735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.13762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.13789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.13882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.15545: stdout chunk (state=3): >>>/root <<< 46400 1727204608.15675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204608.15706: stderr chunk (state=3): >>><<< 46400 1727204608.15710: stdout 
chunk (state=3): >>><<< 46400 1727204608.15728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204608.15739: _low_level_execute_command(): starting 46400 1727204608.15745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103 `" && echo ansible-tmp-1727204608.157294-53152-27577370796103="` echo /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103 `" ) && sleep 0' 46400 1727204608.16202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.16210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.16262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.16268: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.16271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.16273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.16314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.16330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.16384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.18275: stdout chunk (state=3): >>>ansible-tmp-1727204608.157294-53152-27577370796103=/root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103 <<< 46400 1727204608.18396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 
1727204608.18446: stderr chunk (state=3): >>><<< 46400 1727204608.18448: stdout chunk (state=3): >>><<< 46400 1727204608.18470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.157294-53152-27577370796103=/root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204608.18508: variable 'ansible_module_compression' from source: unknown 46400 1727204608.18545: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204608.18586: variable 'ansible_facts' from source: unknown 46400 1727204608.18679: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/AnsiballZ_network_connections.py 46400 1727204608.18787: Sending initial data 46400 1727204608.18791: Sent initial data (166 bytes) 46400 1727204608.19579: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204608.19588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.19600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.19621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.19666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.19674: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204608.19690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.19702: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204608.19710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204608.19722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204608.19730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.19738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.19748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.19755: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.19766: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204608.19771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.19855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.19874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.19886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.19958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.21704: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 46400 1727204608.21710: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204608.21717: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 46400 1727204608.21727: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204608.21783: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204608.21813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmptkirwmkd /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/AnsiballZ_network_connections.py <<< 46400 1727204608.21855: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204608.23116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204608.23304: stderr chunk (state=3): >>><<< 46400 1727204608.23314: stdout chunk (state=3): >>><<< 46400 1727204608.23369: done transferring module to remote 46400 1727204608.23372: _low_level_execute_command(): starting 46400 1727204608.23379: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/ /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/AnsiballZ_network_connections.py && sleep 0' 46400 1727204608.24072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204608.24088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.24106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.24125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.24180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.24192: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204608.24207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.24224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204608.24235: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204608.24255: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204608.24276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.24291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.24305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.24316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.24326: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204608.24338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.24425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.24449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.24479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.24552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.26358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204608.26407: stderr chunk (state=3): >>><<< 46400 1727204608.26410: stdout chunk (state=3): >>><<< 46400 1727204608.26446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204608.26449: _low_level_execute_command(): starting 46400 1727204608.26452: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/AnsiballZ_network_connections.py && sleep 0' 46400 1727204608.27111: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204608.27118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.27128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.27142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.27190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.27197: 
stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204608.27207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.27222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204608.27228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204608.27234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204608.27243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.27251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.27275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.27278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.27280: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204608.27300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.27386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.27390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.27430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.27510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.57920: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204608.59487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204608.59583: stderr chunk (state=3): >>><<< 46400 1727204608.59587: stdout chunk (state=3): >>><<< 46400 1727204608.59729: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
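The module result above records the arguments that the role's "Configure networking connection profiles" step handed to fedora.linux_system_roles.network_connections: provider nm and a single connection named statebr with persistent_state absent and state down. A minimal play that would drive the role with those inputs might look like the sketch below; the variable names network_provider and network_connections are assumptions about the test playbook, which is not shown in this trace.

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # Assumed role inputs, reconstructed from the logged module_args:
        # ask the nm provider to take the 'statebr' profile down and drop
        # its persistent configuration.
        network_provider: nm
        network_connections:
          - name: statebr
            persistent_state: absent
            state: down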
46400 1727204608.59733: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204608.59737: _low_level_execute_command(): starting 46400 1727204608.59740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204608.157294-53152-27577370796103/ > /dev/null 2>&1 && sleep 0' 46400 1727204608.60462: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204608.60484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.60501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.60520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.60561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.60575: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204608.60587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.60602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204608.60612: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204608.60620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204608.60629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.60640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.60652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.60667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204608.60677: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204608.60688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.60765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.60792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.60810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.60887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.62809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204608.62817: stdout chunk 
(state=3): >>><<< 46400 1727204608.62821: stderr chunk (state=3): >>><<< 46400 1727204608.63171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204608.63175: handler run complete 46400 1727204608.63177: attempt loop complete, returning result 46400 1727204608.63179: _execute() done 46400 1727204608.63181: dumping result to json 46400 1727204608.63183: done dumping result, returning 46400 1727204608.63186: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000001d3b] 46400 1727204608.63188: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3b 46400 1727204608.63281: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3b 46400 1727204608.63284: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 46400 1727204608.63395: no more pending results, returning what we have 46400 1727204608.63400: results queue empty 46400 1727204608.63401: checking for any_errors_fatal 46400 1727204608.63408: done checking for any_errors_fatal 46400 1727204608.63409: checking for max_fail_percentage 46400 1727204608.63411: done checking for max_fail_percentage 46400 1727204608.63412: checking to see if all hosts have failed and the running result is not ok 46400 1727204608.63413: done checking to see if all hosts have failed 46400 1727204608.63413: getting the remaining hosts for this loop 46400 1727204608.63415: done getting the remaining hosts for this loop 46400 1727204608.63419: getting the next task for host managed-node2 46400 1727204608.63427: done getting next task for host managed-node2 46400 1727204608.63431: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204608.63437: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204608.63450: getting variables 46400 1727204608.63452: in VariableManager get_vars() 46400 1727204608.63502: Calling all_inventory to load vars for managed-node2 46400 1727204608.63505: Calling groups_inventory to load vars for managed-node2 46400 1727204608.63508: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204608.63519: Calling all_plugins_play to load vars for managed-node2 46400 1727204608.63522: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204608.63526: Calling groups_plugins_play to load vars for managed-node2 46400 1727204608.66223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204608.69006: done with get_vars() 46400 1727204608.69036: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.714) 0:01:38.975 ***** 46400 1727204608.69142: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204608.69514: worker is 1 (out of 1 available) 46400 1727204608.69526: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204608.69537: done queuing things up, now waiting for results queue to drain 46400 1727204608.69539: waiting for pending results... 
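The task dispatched next (roles/network/tasks/main.yml:171) is skipped below because network_state is still the role default of {}. A rough sketch of that gating follows; the module arguments are omitted because the log never templates them, and whether the distribution check sits on the task itself or on an enclosing block is not visible in this trace.

- name: Configure networking state
  fedora.linux_system_roles.network_state:
  when:
    # Both conditions are evaluated in the log; the second is False here,
    # so the task is skipped before any module arguments are built.
    - ansible_distribution_major_version != '6'
    - network_state != {}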
46400 1727204608.69840: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204608.70009: in run() - task 0affcd87-79f5-1303-fda8-000000001d3c 46400 1727204608.70031: variable 'ansible_search_path' from source: unknown 46400 1727204608.70040: variable 'ansible_search_path' from source: unknown 46400 1727204608.70091: calling self._execute() 46400 1727204608.70206: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.70217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.70231: variable 'omit' from source: magic vars 46400 1727204608.70638: variable 'ansible_distribution_major_version' from source: facts 46400 1727204608.70654: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204608.70795: variable 'network_state' from source: role '' defaults 46400 1727204608.70813: Evaluated conditional (network_state != {}): False 46400 1727204608.70821: when evaluation is False, skipping this task 46400 1727204608.70827: _execute() done 46400 1727204608.70834: dumping result to json 46400 1727204608.70843: done dumping result, returning 46400 1727204608.70855: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000001d3c] 46400 1727204608.70871: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204608.71029: no more pending results, returning what we have 46400 1727204608.71034: results queue empty 46400 1727204608.71035: checking for any_errors_fatal 46400 1727204608.71055: done checking for any_errors_fatal 46400 1727204608.71057: checking for max_fail_percentage 46400 1727204608.71061: done checking for max_fail_percentage 46400 1727204608.71062: checking to see if all hosts have failed and the running result is not ok 46400 1727204608.71065: done checking to see if all hosts have failed 46400 1727204608.71066: getting the remaining hosts for this loop 46400 1727204608.71067: done getting the remaining hosts for this loop 46400 1727204608.71074: getting the next task for host managed-node2 46400 1727204608.71084: done getting next task for host managed-node2 46400 1727204608.71089: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204608.71094: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204608.71123: getting variables 46400 1727204608.71124: in VariableManager get_vars() 46400 1727204608.71174: Calling all_inventory to load vars for managed-node2 46400 1727204608.71178: Calling groups_inventory to load vars for managed-node2 46400 1727204608.71180: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204608.71194: Calling all_plugins_play to load vars for managed-node2 46400 1727204608.71197: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204608.71200: Calling groups_plugins_play to load vars for managed-node2 46400 1727204608.72311: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3c 46400 1727204608.72314: WORKER PROCESS EXITING 46400 1727204608.73301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204608.74992: done with get_vars() 46400 1727204608.75021: done getting variables 46400 1727204608.75089: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.059) 0:01:39.035 ***** 46400 1727204608.75127: entering _queue_task() for managed-node2/debug 46400 1727204608.75480: worker is 1 (out of 1 available) 46400 1727204608.75494: exiting _queue_task() for managed-node2/debug 46400 1727204608.75508: done queuing things up, now waiting for results queue to drain 46400 1727204608.75510: waiting for pending results... 
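The task queued next is a plain debug step; its result further down prints __network_connections_result.stderr_lines, so an equivalent definition would be roughly the following sketch (not the role's verbatim source at main.yml:177).

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    # Prints the stderr lines captured from the network_connections run,
    # here the single "no connection matches 'statebr' to delete" message.
    var: __network_connections_result.stderr_lines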
46400 1727204608.75810: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204608.75988: in run() - task 0affcd87-79f5-1303-fda8-000000001d3d 46400 1727204608.76010: variable 'ansible_search_path' from source: unknown 46400 1727204608.76018: variable 'ansible_search_path' from source: unknown 46400 1727204608.76065: calling self._execute() 46400 1727204608.76171: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.76185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.76199: variable 'omit' from source: magic vars 46400 1727204608.76608: variable 'ansible_distribution_major_version' from source: facts 46400 1727204608.76624: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204608.76634: variable 'omit' from source: magic vars 46400 1727204608.76706: variable 'omit' from source: magic vars 46400 1727204608.76746: variable 'omit' from source: magic vars 46400 1727204608.76796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204608.76842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204608.76873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204608.76895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.76910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.76946: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204608.76955: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.76966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.77070: Set connection var ansible_shell_type to sh 46400 1727204608.77085: Set connection var ansible_shell_executable to /bin/sh 46400 1727204608.77094: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204608.77102: Set connection var ansible_connection to ssh 46400 1727204608.77110: Set connection var ansible_pipelining to False 46400 1727204608.77119: Set connection var ansible_timeout to 10 46400 1727204608.77150: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.77157: variable 'ansible_connection' from source: unknown 46400 1727204608.77169: variable 'ansible_module_compression' from source: unknown 46400 1727204608.77175: variable 'ansible_shell_type' from source: unknown 46400 1727204608.77181: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.77187: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.77193: variable 'ansible_pipelining' from source: unknown 46400 1727204608.77199: variable 'ansible_timeout' from source: unknown 46400 1727204608.77205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.77345: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204608.77367: variable 'omit' from source: magic vars 46400 1727204608.77382: starting attempt loop 46400 1727204608.77389: running the handler 46400 1727204608.77529: variable '__network_connections_result' from source: set_fact 46400 1727204608.77595: handler run complete 46400 1727204608.77617: attempt loop complete, returning result 46400 1727204608.77623: _execute() done 46400 1727204608.77629: dumping result to json 46400 1727204608.77635: done dumping result, returning 46400 1727204608.77647: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-000000001d3d] 46400 1727204608.77656: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3d 46400 1727204608.77770: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3d 46400 1727204608.77778: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 46400 1727204608.77869: no more pending results, returning what we have 46400 1727204608.77873: results queue empty 46400 1727204608.77875: checking for any_errors_fatal 46400 1727204608.77883: done checking for any_errors_fatal 46400 1727204608.77884: checking for max_fail_percentage 46400 1727204608.77886: done checking for max_fail_percentage 46400 1727204608.77887: checking to see if all hosts have failed and the running result is not ok 46400 1727204608.77888: done checking to see if all hosts have failed 46400 1727204608.77889: getting the remaining hosts for this loop 46400 1727204608.77891: done getting the remaining hosts for this loop 46400 1727204608.77895: getting the next task for host managed-node2 46400 1727204608.77904: done getting next task for host managed-node2 46400 1727204608.77909: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204608.77915: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204608.77928: getting variables 46400 1727204608.77930: in VariableManager get_vars() 46400 1727204608.77981: Calling all_inventory to load vars for managed-node2 46400 1727204608.77983: Calling groups_inventory to load vars for managed-node2 46400 1727204608.77986: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204608.77997: Calling all_plugins_play to load vars for managed-node2 46400 1727204608.78000: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204608.78003: Calling groups_plugins_play to load vars for managed-node2 46400 1727204608.79822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204608.81651: done with get_vars() 46400 1727204608.81680: done getting variables 46400 1727204608.81741: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.066) 0:01:39.102 ***** 46400 1727204608.81789: entering _queue_task() for managed-node2/debug 46400 1727204608.82121: worker is 1 (out of 1 available) 46400 1727204608.82134: exiting _queue_task() for managed-node2/debug 46400 1727204608.82146: done queuing things up, now waiting for results queue to drain 46400 1727204608.82148: waiting for pending results... 46400 1727204608.82461: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204608.82641: in run() - task 0affcd87-79f5-1303-fda8-000000001d3e 46400 1727204608.82668: variable 'ansible_search_path' from source: unknown 46400 1727204608.82677: variable 'ansible_search_path' from source: unknown 46400 1727204608.82720: calling self._execute() 46400 1727204608.82826: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.82838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.82851: variable 'omit' from source: magic vars 46400 1727204608.83256: variable 'ansible_distribution_major_version' from source: facts 46400 1727204608.83277: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204608.83288: variable 'omit' from source: magic vars 46400 1727204608.83369: variable 'omit' from source: magic vars 46400 1727204608.83407: variable 'omit' from source: magic vars 46400 1727204608.83454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204608.83500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204608.83525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204608.83545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.83564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.83600: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204608.83609: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.83616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.83719: Set connection var ansible_shell_type to sh 46400 1727204608.83734: Set connection var ansible_shell_executable to /bin/sh 46400 1727204608.83743: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204608.83751: Set connection var ansible_connection to ssh 46400 1727204608.83762: Set connection var ansible_pipelining to False 46400 1727204608.83774: Set connection var ansible_timeout to 10 46400 1727204608.83807: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.83816: variable 'ansible_connection' from source: unknown 46400 1727204608.83822: variable 'ansible_module_compression' from source: unknown 46400 1727204608.83828: variable 'ansible_shell_type' from source: unknown 46400 1727204608.83833: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.83839: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.83845: variable 'ansible_pipelining' from source: unknown 46400 1727204608.83851: variable 'ansible_timeout' from source: unknown 46400 1727204608.83858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.84008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204608.84024: variable 'omit' from source: magic vars 46400 1727204608.84033: starting attempt loop 46400 1727204608.84040: running the handler 46400 1727204608.84095: variable '__network_connections_result' from source: set_fact 46400 1727204608.84189: variable '__network_connections_result' from source: set_fact 46400 1727204608.84310: handler run complete 46400 1727204608.84344: attempt loop complete, returning result 46400 1727204608.84351: _execute() done 46400 1727204608.84362: dumping result to json 46400 1727204608.84374: done dumping result, returning 46400 1727204608.84385: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-000000001d3e] 46400 1727204608.84396: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3e ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 46400 1727204608.84595: no more pending results, returning what we have 46400 1727204608.84599: results queue empty 46400 1727204608.84600: checking for any_errors_fatal 46400 1727204608.84607: done checking for any_errors_fatal 46400 1727204608.84608: checking for max_fail_percentage 46400 1727204608.84609: done 
checking for max_fail_percentage 46400 1727204608.84610: checking to see if all hosts have failed and the running result is not ok 46400 1727204608.84611: done checking to see if all hosts have failed 46400 1727204608.84612: getting the remaining hosts for this loop 46400 1727204608.84613: done getting the remaining hosts for this loop 46400 1727204608.84617: getting the next task for host managed-node2 46400 1727204608.84626: done getting next task for host managed-node2 46400 1727204608.84630: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204608.84636: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204608.84648: getting variables 46400 1727204608.84649: in VariableManager get_vars() 46400 1727204608.84696: Calling all_inventory to load vars for managed-node2 46400 1727204608.84699: Calling groups_inventory to load vars for managed-node2 46400 1727204608.84701: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204608.84711: Calling all_plugins_play to load vars for managed-node2 46400 1727204608.84719: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204608.84722: Calling groups_plugins_play to load vars for managed-node2 46400 1727204608.85913: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3e 46400 1727204608.85916: WORKER PROCESS EXITING 46400 1727204608.86573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204608.88174: done with get_vars() 46400 1727204608.88195: done getting variables 46400 1727204608.88239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.064) 0:01:39.167 ***** 46400 1727204608.88272: entering _queue_task() for managed-node2/debug 46400 1727204608.88510: worker is 1 (out of 1 available) 46400 1727204608.88525: exiting _queue_task() for managed-node2/debug 46400 1727204608.88539: done queuing 
things up, now waiting for results queue to drain 46400 1727204608.88540: waiting for pending results... 46400 1727204608.88737: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204608.88845: in run() - task 0affcd87-79f5-1303-fda8-000000001d3f 46400 1727204608.88861: variable 'ansible_search_path' from source: unknown 46400 1727204608.88867: variable 'ansible_search_path' from source: unknown 46400 1727204608.88893: calling self._execute() 46400 1727204608.88973: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.88979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.88989: variable 'omit' from source: magic vars 46400 1727204608.89296: variable 'ansible_distribution_major_version' from source: facts 46400 1727204608.89307: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204608.89398: variable 'network_state' from source: role '' defaults 46400 1727204608.89405: Evaluated conditional (network_state != {}): False 46400 1727204608.89407: when evaluation is False, skipping this task 46400 1727204608.89410: _execute() done 46400 1727204608.89412: dumping result to json 46400 1727204608.89417: done dumping result, returning 46400 1727204608.89424: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-000000001d3f] 46400 1727204608.89429: sending task result for task 0affcd87-79f5-1303-fda8-000000001d3f 46400 1727204608.89519: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d3f 46400 1727204608.89522: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204608.89576: no more pending results, returning what we have 46400 1727204608.89580: results queue empty 46400 1727204608.89581: checking for any_errors_fatal 46400 1727204608.89595: done checking for any_errors_fatal 46400 1727204608.89596: checking for max_fail_percentage 46400 1727204608.89598: done checking for max_fail_percentage 46400 1727204608.89598: checking to see if all hosts have failed and the running result is not ok 46400 1727204608.89599: done checking to see if all hosts have failed 46400 1727204608.89600: getting the remaining hosts for this loop 46400 1727204608.89604: done getting the remaining hosts for this loop 46400 1727204608.89608: getting the next task for host managed-node2 46400 1727204608.89616: done getting next task for host managed-node2 46400 1727204608.89622: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204608.89628: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204608.89657: getting variables 46400 1727204608.89665: in VariableManager get_vars() 46400 1727204608.89718: Calling all_inventory to load vars for managed-node2 46400 1727204608.89721: Calling groups_inventory to load vars for managed-node2 46400 1727204608.89723: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204608.89735: Calling all_plugins_play to load vars for managed-node2 46400 1727204608.89737: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204608.89744: Calling groups_plugins_play to load vars for managed-node2 46400 1727204608.91285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204608.92195: done with get_vars() 46400 1727204608.92212: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.040) 0:01:39.207 ***** 46400 1727204608.92289: entering _queue_task() for managed-node2/ping 46400 1727204608.92519: worker is 1 (out of 1 available) 46400 1727204608.92534: exiting _queue_task() for managed-node2/ping 46400 1727204608.92546: done queuing things up, now waiting for results queue to drain 46400 1727204608.92548: waiting for pending results... 46400 1727204608.92734: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204608.92847: in run() - task 0affcd87-79f5-1303-fda8-000000001d40 46400 1727204608.92873: variable 'ansible_search_path' from source: unknown 46400 1727204608.92881: variable 'ansible_search_path' from source: unknown 46400 1727204608.92914: calling self._execute() 46400 1727204608.93024: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.93035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.93049: variable 'omit' from source: magic vars 46400 1727204608.93475: variable 'ansible_distribution_major_version' from source: facts 46400 1727204608.93493: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204608.93505: variable 'omit' from source: magic vars 46400 1727204608.93591: variable 'omit' from source: magic vars 46400 1727204608.93632: variable 'omit' from source: magic vars 46400 1727204608.93691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204608.93730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204608.93764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204608.93790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.93804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204608.93837: variable 'inventory_hostname' from 
source: host vars for 'managed-node2' 46400 1727204608.93846: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.93853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.93958: Set connection var ansible_shell_type to sh 46400 1727204608.93979: Set connection var ansible_shell_executable to /bin/sh 46400 1727204608.94001: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204608.94013: Set connection var ansible_connection to ssh 46400 1727204608.94023: Set connection var ansible_pipelining to False 46400 1727204608.94033: Set connection var ansible_timeout to 10 46400 1727204608.94069: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.94078: variable 'ansible_connection' from source: unknown 46400 1727204608.94085: variable 'ansible_module_compression' from source: unknown 46400 1727204608.94098: variable 'ansible_shell_type' from source: unknown 46400 1727204608.94110: variable 'ansible_shell_executable' from source: unknown 46400 1727204608.94116: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204608.94123: variable 'ansible_pipelining' from source: unknown 46400 1727204608.94128: variable 'ansible_timeout' from source: unknown 46400 1727204608.94134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204608.94353: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204608.94371: variable 'omit' from source: magic vars 46400 1727204608.94380: starting attempt loop 46400 1727204608.94387: running the handler 46400 1727204608.94403: _low_level_execute_command(): starting 46400 1727204608.94417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204608.95102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204608.95112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.95132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.95145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.95199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.95211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.95218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.95279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 
1727204608.96930: stdout chunk (state=3): >>>/root <<< 46400 1727204608.97023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204608.97120: stderr chunk (state=3): >>><<< 46400 1727204608.97135: stdout chunk (state=3): >>><<< 46400 1727204608.97175: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204608.97202: _low_level_execute_command(): starting 46400 1727204608.97213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583 `" && echo ansible-tmp-1727204608.9718213-53198-147381315656583="` echo /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583 `" ) && sleep 0' 46400 1727204608.97912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.97916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204608.97955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204608.97959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204608.97977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204608.98041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204608.98049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204608.98052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204608.98127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204608.99985: stdout chunk 
(state=3): >>>ansible-tmp-1727204608.9718213-53198-147381315656583=/root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583 <<< 46400 1727204609.00095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.00153: stderr chunk (state=3): >>><<< 46400 1727204609.00156: stdout chunk (state=3): >>><<< 46400 1727204609.00175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.9718213-53198-147381315656583=/root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.00221: variable 'ansible_module_compression' from source: unknown 46400 1727204609.00253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204609.00283: variable 'ansible_facts' from source: unknown 46400 1727204609.00412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/AnsiballZ_ping.py 46400 1727204609.00572: Sending initial data 46400 1727204609.00576: Sent initial data (153 bytes) 46400 1727204609.01642: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.01646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.01679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.01682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.01732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.01745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 46400 1727204609.01787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.03519: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204609.03551: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204609.03587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpw88c4tnd /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/AnsiballZ_ping.py <<< 46400 1727204609.03621: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204609.04400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.04520: stderr chunk (state=3): >>><<< 46400 1727204609.04525: stdout chunk (state=3): >>><<< 46400 1727204609.04545: done transferring module to remote 46400 1727204609.04554: _low_level_execute_command(): starting 46400 1727204609.04559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/ /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/AnsiballZ_ping.py && sleep 0' 46400 1727204609.05039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.05044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.05085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.05107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.05110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.05114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.05171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.05175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.05177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.05221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.06955: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.07017: stderr chunk (state=3): >>><<< 46400 1727204609.07020: stdout chunk (state=3): >>><<< 46400 1727204609.07041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.07044: _low_level_execute_command(): starting 46400 1727204609.07048: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/AnsiballZ_ping.py && sleep 0' 46400 1727204609.07535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.07539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.07569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.07588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.07591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.07642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.07646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.07652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.07699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.20887: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204609.22023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204609.22026: stdout chunk (state=3): >>><<< 46400 1727204609.22028: stderr chunk (state=3): >>><<< 46400 1727204609.22144: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204609.22149: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204609.22151: _low_level_execute_command(): starting 46400 1727204609.22153: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204608.9718213-53198-147381315656583/ > /dev/null 2>&1 && sleep 0' 46400 1727204609.22883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204609.22900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.22916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.22943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.22994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204609.23007: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204609.23023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.23048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204609.23066: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204609.23079: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204609.23093: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.23107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.23124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.23144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204609.23157: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204609.23179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.23381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.23406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.23424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.23501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.25310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.25395: stderr chunk (state=3): >>><<< 46400 1727204609.25410: stdout chunk (state=3): >>><<< 46400 1727204609.25770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.25774: handler run complete 46400 1727204609.25777: attempt loop complete, returning result 46400 1727204609.25779: _execute() done 46400 1727204609.25781: dumping result to json 46400 1727204609.25783: done dumping result, returning 46400 1727204609.25785: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-000000001d40] 46400 1727204609.25788: sending task result for task 0affcd87-79f5-1303-fda8-000000001d40 46400 1727204609.25853: done sending task result for task 0affcd87-79f5-1303-fda8-000000001d40 46400 1727204609.25856: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204609.25926: no more pending results, returning what we have 46400 1727204609.25929: results queue empty 46400 1727204609.25930: checking for any_errors_fatal 46400 1727204609.25936: done checking for any_errors_fatal 46400 1727204609.25937: checking for max_fail_percentage 46400 1727204609.25939: done checking for max_fail_percentage 46400 
1727204609.25940: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.25941: done checking to see if all hosts have failed 46400 1727204609.25942: getting the remaining hosts for this loop 46400 1727204609.25943: done getting the remaining hosts for this loop 46400 1727204609.25946: getting the next task for host managed-node2 46400 1727204609.25961: done getting next task for host managed-node2 46400 1727204609.25966: ^ task is: TASK: meta (role_complete) 46400 1727204609.25972: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204609.25985: getting variables 46400 1727204609.25987: in VariableManager get_vars() 46400 1727204609.26033: Calling all_inventory to load vars for managed-node2 46400 1727204609.26036: Calling groups_inventory to load vars for managed-node2 46400 1727204609.26040: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.26051: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.26054: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.26057: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.27691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.29500: done with get_vars() 46400 1727204609.29531: done getting variables 46400 1727204609.29617: done queuing things up, now waiting for results queue to drain 46400 1727204609.29619: results queue empty 46400 1727204609.29620: checking for any_errors_fatal 46400 1727204609.29623: done checking for any_errors_fatal 46400 1727204609.29623: checking for max_fail_percentage 46400 1727204609.29625: done checking for max_fail_percentage 46400 1727204609.29625: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.29626: done checking to see if all hosts have failed 46400 1727204609.29627: getting the remaining hosts for this loop 46400 1727204609.29628: done getting the remaining hosts for this loop 46400 1727204609.29630: getting the next task for host managed-node2 46400 1727204609.29641: done getting next task for host managed-node2 46400 1727204609.29644: ^ task is: TASK: Asserts 46400 1727204609.29646: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204609.29649: getting variables 46400 1727204609.29650: in VariableManager get_vars() 46400 1727204609.29666: Calling all_inventory to load vars for managed-node2 46400 1727204609.29668: Calling groups_inventory to load vars for managed-node2 46400 1727204609.29670: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.29675: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.29677: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.29680: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.31101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.32848: done with get_vars() 46400 1727204609.32880: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.406) 0:01:39.614 ***** 46400 1727204609.32972: entering _queue_task() for managed-node2/include_tasks 46400 1727204609.33507: worker is 1 (out of 1 available) 46400 1727204609.33521: exiting _queue_task() for managed-node2/include_tasks 46400 1727204609.33539: done queuing things up, now waiting for results queue to drain 46400 1727204609.33541: waiting for pending results... 
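At this point the play has reached the "Asserts" step of run_test.yml (line 36), which the entries below queue and run as an include_tasks loop. A minimal sketch of what that task plausibly looks like, reconstructed only from what the log exposes (the lsr_assert include parameter, the per-item include of tasks/assert_profile_absent.yml, and the distribution-version conditional evaluated for it); the real file is not reproduced in this log, so the exact keys and the placement of the when: are assumptions, not the actual source:

# Hypothetical reconstruction of the "Asserts" task at
# tests/network/playbooks/tasks/run_test.yml:36 (not quoted from the log).
- name: Asserts
  include_tasks: "{{ item }}"          # item resolves to tasks/assert_profile_absent.yml below
  loop: "{{ lsr_assert }}"             # lsr_assert arrives via include params, per the log
  when: ansible_distribution_major_version != '6'   # conditional the log evaluates for this task

Each included file is then processed as its own block, which is why the following entries generate all_blocks data for assert_profile_absent.yml and extend the task lists for managed-node2 before the next task runs.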
46400 1727204609.33877: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204609.34142: in run() - task 0affcd87-79f5-1303-fda8-000000001749 46400 1727204609.34170: variable 'ansible_search_path' from source: unknown 46400 1727204609.34179: variable 'ansible_search_path' from source: unknown 46400 1727204609.34238: variable 'lsr_assert' from source: include params 46400 1727204609.34480: variable 'lsr_assert' from source: include params 46400 1727204609.34562: variable 'omit' from source: magic vars 46400 1727204609.34723: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.34737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.34753: variable 'omit' from source: magic vars 46400 1727204609.34990: variable 'ansible_distribution_major_version' from source: facts 46400 1727204609.35004: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204609.35014: variable 'item' from source: unknown 46400 1727204609.35085: variable 'item' from source: unknown 46400 1727204609.35129: variable 'item' from source: unknown 46400 1727204609.35198: variable 'item' from source: unknown 46400 1727204609.35369: dumping result to json 46400 1727204609.35378: done dumping result, returning 46400 1727204609.35389: done running TaskExecutor() for managed-node2/TASK: Asserts [0affcd87-79f5-1303-fda8-000000001749] 46400 1727204609.35399: sending task result for task 0affcd87-79f5-1303-fda8-000000001749 46400 1727204609.35489: no more pending results, returning what we have 46400 1727204609.35494: in VariableManager get_vars() 46400 1727204609.35543: Calling all_inventory to load vars for managed-node2 46400 1727204609.35546: Calling groups_inventory to load vars for managed-node2 46400 1727204609.35550: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.35569: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.35573: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.35578: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.36700: done sending task result for task 0affcd87-79f5-1303-fda8-000000001749 46400 1727204609.36704: WORKER PROCESS EXITING 46400 1727204609.37442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.39610: done with get_vars() 46400 1727204609.39631: variable 'ansible_search_path' from source: unknown 46400 1727204609.39633: variable 'ansible_search_path' from source: unknown 46400 1727204609.39681: we have included files to process 46400 1727204609.39683: generating all_blocks data 46400 1727204609.39685: done generating all_blocks data 46400 1727204609.39692: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204609.39697: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204609.39700: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204609.39823: in VariableManager get_vars() 46400 1727204609.39847: done with get_vars() 46400 1727204609.39986: done processing included file 46400 1727204609.39988: iterating over new_blocks loaded from include file 46400 1727204609.39990: in VariableManager 
get_vars() 46400 1727204609.40008: done with get_vars() 46400 1727204609.40010: filtering new block on tags 46400 1727204609.40065: done filtering new block on tags 46400 1727204609.40067: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 46400 1727204609.40072: extending task lists for all hosts with included blocks 46400 1727204609.41189: done extending task lists 46400 1727204609.41190: done processing included files 46400 1727204609.41191: results queue empty 46400 1727204609.41191: checking for any_errors_fatal 46400 1727204609.41193: done checking for any_errors_fatal 46400 1727204609.41194: checking for max_fail_percentage 46400 1727204609.41195: done checking for max_fail_percentage 46400 1727204609.41196: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.41197: done checking to see if all hosts have failed 46400 1727204609.41198: getting the remaining hosts for this loop 46400 1727204609.41199: done getting the remaining hosts for this loop 46400 1727204609.41201: getting the next task for host managed-node2 46400 1727204609.41206: done getting next task for host managed-node2 46400 1727204609.41208: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204609.41211: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204609.41213: getting variables 46400 1727204609.41214: in VariableManager get_vars() 46400 1727204609.41231: Calling all_inventory to load vars for managed-node2 46400 1727204609.41234: Calling groups_inventory to load vars for managed-node2 46400 1727204609.41236: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.41242: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.41245: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.41248: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.42603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.44446: done with get_vars() 46400 1727204609.44553: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.116) 0:01:39.731 ***** 46400 1727204609.44654: entering _queue_task() for managed-node2/include_tasks 46400 1727204609.45036: worker is 1 (out of 1 available) 46400 1727204609.45048: exiting _queue_task() for managed-node2/include_tasks 46400 1727204609.45065: done queuing things up, now waiting for results queue to drain 46400 1727204609.45067: waiting for pending results... 46400 1727204609.45377: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204609.45529: in run() - task 0affcd87-79f5-1303-fda8-000000001e99 46400 1727204609.45549: variable 'ansible_search_path' from source: unknown 46400 1727204609.45558: variable 'ansible_search_path' from source: unknown 46400 1727204609.45605: calling self._execute() 46400 1727204609.45718: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.45736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.45750: variable 'omit' from source: magic vars 46400 1727204609.46154: variable 'ansible_distribution_major_version' from source: facts 46400 1727204609.46183: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204609.46193: _execute() done 46400 1727204609.46201: dumping result to json 46400 1727204609.46207: done dumping result, returning 46400 1727204609.46215: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-000000001e99] 46400 1727204609.46225: sending task result for task 0affcd87-79f5-1303-fda8-000000001e99 46400 1727204609.46354: no more pending results, returning what we have 46400 1727204609.46362: in VariableManager get_vars() 46400 1727204609.46418: Calling all_inventory to load vars for managed-node2 46400 1727204609.46421: Calling groups_inventory to load vars for managed-node2 46400 1727204609.46425: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.46440: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.46444: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.46447: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.47505: done sending task result for task 0affcd87-79f5-1303-fda8-000000001e99 46400 1727204609.47509: WORKER PROCESS EXITING 46400 1727204609.48489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 46400 1727204609.50290: done with get_vars() 46400 1727204609.50316: variable 'ansible_search_path' from source: unknown 46400 1727204609.50317: variable 'ansible_search_path' from source: unknown 46400 1727204609.50327: variable 'item' from source: include params 46400 1727204609.50456: variable 'item' from source: include params 46400 1727204609.50496: we have included files to process 46400 1727204609.50497: generating all_blocks data 46400 1727204609.50499: done generating all_blocks data 46400 1727204609.50500: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204609.50501: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204609.50503: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204609.51484: done processing included file 46400 1727204609.51486: iterating over new_blocks loaded from include file 46400 1727204609.51488: in VariableManager get_vars() 46400 1727204609.51508: done with get_vars() 46400 1727204609.51510: filtering new block on tags 46400 1727204609.51592: done filtering new block on tags 46400 1727204609.51596: in VariableManager get_vars() 46400 1727204609.51612: done with get_vars() 46400 1727204609.51614: filtering new block on tags 46400 1727204609.51683: done filtering new block on tags 46400 1727204609.51686: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204609.51691: extending task lists for all hosts with included blocks 46400 1727204609.51942: done extending task lists 46400 1727204609.51943: done processing included files 46400 1727204609.51944: results queue empty 46400 1727204609.51945: checking for any_errors_fatal 46400 1727204609.51949: done checking for any_errors_fatal 46400 1727204609.51949: checking for max_fail_percentage 46400 1727204609.51950: done checking for max_fail_percentage 46400 1727204609.51951: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.51952: done checking to see if all hosts have failed 46400 1727204609.51953: getting the remaining hosts for this loop 46400 1727204609.51954: done getting the remaining hosts for this loop 46400 1727204609.51956: getting the next task for host managed-node2 46400 1727204609.51962: done getting next task for host managed-node2 46400 1727204609.51965: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204609.51969: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204609.51971: getting variables 46400 1727204609.51972: in VariableManager get_vars() 46400 1727204609.51985: Calling all_inventory to load vars for managed-node2 46400 1727204609.51987: Calling groups_inventory to load vars for managed-node2 46400 1727204609.51989: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.51995: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.51997: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.51999: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.53416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.55151: done with get_vars() 46400 1727204609.55187: done getting variables 46400 1727204609.55239: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.106) 0:01:39.837 ***** 46400 1727204609.55278: entering _queue_task() for managed-node2/set_fact 46400 1727204609.55645: worker is 1 (out of 1 available) 46400 1727204609.55656: exiting _queue_task() for managed-node2/set_fact 46400 1727204609.55677: done queuing things up, now waiting for results queue to drain 46400 1727204609.55678: waiting for pending results... 
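The next task executed is the first step of get_profile_stat.yml: a set_fact that resets three bookkeeping flags before the profile is inspected. A minimal sketch of that task, inferred from the set_fact action and the three ansible_facts returned in the ok: result further down; the actual file contents are assumed, not quoted:

# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/get_profile_stat.yml:3, based on the
# ansible_facts shown in the task result below.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

Because set_fact runs entirely on the controller, no _low_level_execute_command or SSH traffic appears for this task; the handler completes immediately, unlike the ping task above and the stat task that follows.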
46400 1727204609.55980: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204609.56142: in run() - task 0affcd87-79f5-1303-fda8-000000001f17 46400 1727204609.56165: variable 'ansible_search_path' from source: unknown 46400 1727204609.56173: variable 'ansible_search_path' from source: unknown 46400 1727204609.56217: calling self._execute() 46400 1727204609.56329: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.56342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.56354: variable 'omit' from source: magic vars 46400 1727204609.56770: variable 'ansible_distribution_major_version' from source: facts 46400 1727204609.56792: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204609.56804: variable 'omit' from source: magic vars 46400 1727204609.56868: variable 'omit' from source: magic vars 46400 1727204609.56911: variable 'omit' from source: magic vars 46400 1727204609.56964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204609.57013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204609.57041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204609.57068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204609.57089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204609.57128: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204609.57137: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.57145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.57257: Set connection var ansible_shell_type to sh 46400 1727204609.57280: Set connection var ansible_shell_executable to /bin/sh 46400 1727204609.57290: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204609.57304: Set connection var ansible_connection to ssh 46400 1727204609.57319: Set connection var ansible_pipelining to False 46400 1727204609.57329: Set connection var ansible_timeout to 10 46400 1727204609.57357: variable 'ansible_shell_executable' from source: unknown 46400 1727204609.57370: variable 'ansible_connection' from source: unknown 46400 1727204609.57378: variable 'ansible_module_compression' from source: unknown 46400 1727204609.57384: variable 'ansible_shell_type' from source: unknown 46400 1727204609.57391: variable 'ansible_shell_executable' from source: unknown 46400 1727204609.57398: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.57410: variable 'ansible_pipelining' from source: unknown 46400 1727204609.57420: variable 'ansible_timeout' from source: unknown 46400 1727204609.57427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.57584: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204609.57603: variable 
'omit' from source: magic vars 46400 1727204609.57615: starting attempt loop 46400 1727204609.57627: running the handler 46400 1727204609.57649: handler run complete 46400 1727204609.57668: attempt loop complete, returning result 46400 1727204609.57675: _execute() done 46400 1727204609.57681: dumping result to json 46400 1727204609.57687: done dumping result, returning 46400 1727204609.57696: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-000000001f17] 46400 1727204609.57705: sending task result for task 0affcd87-79f5-1303-fda8-000000001f17 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204609.57863: no more pending results, returning what we have 46400 1727204609.57869: results queue empty 46400 1727204609.57871: checking for any_errors_fatal 46400 1727204609.57873: done checking for any_errors_fatal 46400 1727204609.57874: checking for max_fail_percentage 46400 1727204609.57876: done checking for max_fail_percentage 46400 1727204609.57877: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.57878: done checking to see if all hosts have failed 46400 1727204609.57879: getting the remaining hosts for this loop 46400 1727204609.57880: done getting the remaining hosts for this loop 46400 1727204609.57884: getting the next task for host managed-node2 46400 1727204609.57896: done getting next task for host managed-node2 46400 1727204609.57899: ^ task is: TASK: Stat profile file 46400 1727204609.57906: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204609.57912: getting variables 46400 1727204609.57914: in VariableManager get_vars() 46400 1727204609.57967: Calling all_inventory to load vars for managed-node2 46400 1727204609.57970: Calling groups_inventory to load vars for managed-node2 46400 1727204609.57974: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.57987: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.57989: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.57992: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.58985: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f17 46400 1727204609.58988: WORKER PROCESS EXITING 46400 1727204609.59947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204609.61705: done with get_vars() 46400 1727204609.61735: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.065) 0:01:39.903 ***** 46400 1727204609.61846: entering _queue_task() for managed-node2/stat 46400 1727204609.62220: worker is 1 (out of 1 available) 46400 1727204609.62234: exiting _queue_task() for managed-node2/stat 46400 1727204609.62247: done queuing things up, now waiting for results queue to drain 46400 1727204609.62249: waiting for pending results... 46400 1727204609.62576: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204609.62723: in run() - task 0affcd87-79f5-1303-fda8-000000001f18 46400 1727204609.62751: variable 'ansible_search_path' from source: unknown 46400 1727204609.62763: variable 'ansible_search_path' from source: unknown 46400 1727204609.62812: calling self._execute() 46400 1727204609.62932: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.62944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.62966: variable 'omit' from source: magic vars 46400 1727204609.63380: variable 'ansible_distribution_major_version' from source: facts 46400 1727204609.63404: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204609.63417: variable 'omit' from source: magic vars 46400 1727204609.63489: variable 'omit' from source: magic vars 46400 1727204609.63605: variable 'profile' from source: play vars 46400 1727204609.63621: variable 'interface' from source: play vars 46400 1727204609.63699: variable 'interface' from source: play vars 46400 1727204609.63726: variable 'omit' from source: magic vars 46400 1727204609.63781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204609.63809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204609.63828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204609.63843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204609.63852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204609.63881: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 46400 1727204609.63894: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.63897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.63958: Set connection var ansible_shell_type to sh 46400 1727204609.63971: Set connection var ansible_shell_executable to /bin/sh 46400 1727204609.63977: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204609.63981: Set connection var ansible_connection to ssh 46400 1727204609.63986: Set connection var ansible_pipelining to False 46400 1727204609.63992: Set connection var ansible_timeout to 10 46400 1727204609.64012: variable 'ansible_shell_executable' from source: unknown 46400 1727204609.64017: variable 'ansible_connection' from source: unknown 46400 1727204609.64020: variable 'ansible_module_compression' from source: unknown 46400 1727204609.64022: variable 'ansible_shell_type' from source: unknown 46400 1727204609.64024: variable 'ansible_shell_executable' from source: unknown 46400 1727204609.64026: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204609.64031: variable 'ansible_pipelining' from source: unknown 46400 1727204609.64033: variable 'ansible_timeout' from source: unknown 46400 1727204609.64038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204609.64195: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204609.64204: variable 'omit' from source: magic vars 46400 1727204609.64209: starting attempt loop 46400 1727204609.64211: running the handler 46400 1727204609.64225: _low_level_execute_command(): starting 46400 1727204609.64232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204609.64744: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.64756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.64786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204609.64800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.64851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.64866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.64924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.66575: stdout chunk (state=3): >>>/root <<< 46400 1727204609.66685: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.66730: stderr chunk (state=3): >>><<< 46400 1727204609.66733: stdout chunk (state=3): >>><<< 46400 1727204609.66752: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.66770: _low_level_execute_command(): starting 46400 1727204609.66777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838 `" && echo ansible-tmp-1727204609.6675131-53220-251965358849838="` echo /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838 `" ) && sleep 0' 46400 1727204609.67195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.67202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.67233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.67255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.67301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.67315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.67365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.69217: stdout chunk (state=3): >>>ansible-tmp-1727204609.6675131-53220-251965358849838=/root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838 <<< 46400 1727204609.69402: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 46400 1727204609.69405: stdout chunk (state=3): >>><<< 46400 1727204609.69413: stderr chunk (state=3): >>><<< 46400 1727204609.69440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204609.6675131-53220-251965358849838=/root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.69496: variable 'ansible_module_compression' from source: unknown 46400 1727204609.69566: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204609.69604: variable 'ansible_facts' from source: unknown 46400 1727204609.69701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/AnsiballZ_stat.py 46400 1727204609.69994: Sending initial data 46400 1727204609.70008: Sent initial data (153 bytes) 46400 1727204609.71111: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.71117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.71155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.71159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.71175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.71181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204609.71260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.71302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.71323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.71337: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.71409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.73218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204609.73262: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204609.73298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpuexklrqa /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/AnsiballZ_stat.py <<< 46400 1727204609.74070: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204609.74386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.74492: stderr chunk (state=3): >>><<< 46400 1727204609.74496: stdout chunk (state=3): >>><<< 46400 1727204609.74513: done transferring module to remote 46400 1727204609.74521: _low_level_execute_command(): starting 46400 1727204609.74526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/ /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/AnsiballZ_stat.py && sleep 0' 46400 1727204609.74948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.74954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.75000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.75004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.75007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.75058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.75072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.75117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.76948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.76952: 
stderr chunk (state=3): >>><<< 46400 1727204609.76954: stdout chunk (state=3): >>><<< 46400 1727204609.76971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.76976: _low_level_execute_command(): starting 46400 1727204609.76979: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/AnsiballZ_stat.py && sleep 0' 46400 1727204609.77683: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204609.77721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.77751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204609.77759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.77786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204609.77790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204609.77792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.77842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.77846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.77906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.91213: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204609.92333: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204609.92337: stderr chunk (state=3): >>><<< 46400 1727204609.92343: stdout chunk (state=3): >>><<< 46400 1727204609.92374: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204609.92402: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204609.92410: _low_level_execute_command(): starting 46400 1727204609.92415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204609.6675131-53220-251965358849838/ > /dev/null 2>&1 && sleep 0' 46400 1727204609.93074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.93079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204609.93118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.93121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204609.93136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204609.93144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204609.93219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204609.93232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204609.93238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204609.93301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204609.95507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204609.95576: stderr chunk (state=3): >>><<< 46400 1727204609.95580: stdout chunk (state=3): >>><<< 46400 1727204609.95596: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204609.95603: handler run complete 46400 1727204609.95627: attempt loop complete, returning result 46400 1727204609.95630: _execute() done 46400 1727204609.95632: dumping result to json 46400 1727204609.95634: done dumping result, returning 46400 1727204609.95644: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-000000001f18] 46400 1727204609.95650: sending task result for task 0affcd87-79f5-1303-fda8-000000001f18 46400 1727204609.95757: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f18 46400 1727204609.95760: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204609.95814: no more pending results, returning what we have 46400 1727204609.95818: results queue empty 46400 1727204609.95819: checking for any_errors_fatal 46400 1727204609.95826: done checking for any_errors_fatal 46400 1727204609.95827: checking for max_fail_percentage 46400 1727204609.95829: done checking for max_fail_percentage 46400 1727204609.95830: checking to see if all hosts have failed and the running result is not ok 46400 1727204609.95831: done checking to see if all hosts have failed 46400 1727204609.95831: getting the 
remaining hosts for this loop 46400 1727204609.95833: done getting the remaining hosts for this loop 46400 1727204609.95837: getting the next task for host managed-node2 46400 1727204609.95846: done getting next task for host managed-node2 46400 1727204609.95848: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204609.95853: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204609.95858: getting variables 46400 1727204609.95859: in VariableManager get_vars() 46400 1727204609.95905: Calling all_inventory to load vars for managed-node2 46400 1727204609.95908: Calling groups_inventory to load vars for managed-node2 46400 1727204609.95911: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204609.95922: Calling all_plugins_play to load vars for managed-node2 46400 1727204609.95925: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204609.95927: Calling groups_plugins_play to load vars for managed-node2 46400 1727204609.98123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.02178: done with get_vars() 46400 1727204610.02204: done getting variables 46400 1727204610.02267: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.404) 0:01:40.307 ***** 46400 1727204610.02304: entering _queue_task() for managed-node2/set_fact 46400 1727204610.03763: worker is 1 (out of 1 available) 46400 1727204610.03779: exiting _queue_task() for managed-node2/set_fact 46400 1727204610.03792: done queuing things up, now waiting for results queue to drain 46400 1727204610.03793: waiting for pending results... 
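The "Stat profile file" task whose result appears above can be read back from the logged module invocation: the stat module was run against /etc/sysconfig/network-scripts/ifcfg-statebr with get_attributes, get_checksum and get_mime disabled, the conditional ansible_distribution_major_version != '6' evaluated True, and the following task consults profile_stat.stat.exists, so the result is registered as profile_stat. A minimal sketch of that task as it runs here, assuming the real entry at get_profile_stat.yml:9 builds the path from the profile/interface play vars rather than hard-coding it:

- name: Stat profile file
  stat:
    # values taken from the logged module_args; the real task derives the
    # filename from the 'profile'/'interface' play vars shown earlier
    path: /etc/sysconfig/network-scripts/ifcfg-statebr
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
  when: ansible_distribution_major_version != '6'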
46400 1727204610.04302: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204610.04544: in run() - task 0affcd87-79f5-1303-fda8-000000001f19 46400 1727204610.04677: variable 'ansible_search_path' from source: unknown 46400 1727204610.04684: variable 'ansible_search_path' from source: unknown 46400 1727204610.04724: calling self._execute() 46400 1727204610.04834: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.05008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.05021: variable 'omit' from source: magic vars 46400 1727204610.05621: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.05774: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.06010: variable 'profile_stat' from source: set_fact 46400 1727204610.06025: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204610.06032: when evaluation is False, skipping this task 46400 1727204610.06040: _execute() done 46400 1727204610.06087: dumping result to json 46400 1727204610.06096: done dumping result, returning 46400 1727204610.06106: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-000000001f19] 46400 1727204610.06116: sending task result for task 0affcd87-79f5-1303-fda8-000000001f19 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204610.06263: no more pending results, returning what we have 46400 1727204610.06270: results queue empty 46400 1727204610.06271: checking for any_errors_fatal 46400 1727204610.06284: done checking for any_errors_fatal 46400 1727204610.06285: checking for max_fail_percentage 46400 1727204610.06287: done checking for max_fail_percentage 46400 1727204610.06288: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.06289: done checking to see if all hosts have failed 46400 1727204610.06289: getting the remaining hosts for this loop 46400 1727204610.06291: done getting the remaining hosts for this loop 46400 1727204610.06295: getting the next task for host managed-node2 46400 1727204610.06305: done getting next task for host managed-node2 46400 1727204610.06308: ^ task is: TASK: Get NM profile info 46400 1727204610.06313: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.06317: getting variables 46400 1727204610.06319: in VariableManager get_vars() 46400 1727204610.06368: Calling all_inventory to load vars for managed-node2 46400 1727204610.06372: Calling groups_inventory to load vars for managed-node2 46400 1727204610.06376: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.06391: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.06394: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.06398: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.07724: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f19 46400 1727204610.07728: WORKER PROCESS EXITING 46400 1727204610.09725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.11498: done with get_vars() 46400 1727204610.11531: done getting variables 46400 1727204610.11604: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.093) 0:01:40.401 ***** 46400 1727204610.11648: entering _queue_task() for managed-node2/shell 46400 1727204610.12024: worker is 1 (out of 1 available) 46400 1727204610.12039: exiting _queue_task() for managed-node2/shell 46400 1727204610.12053: done queuing things up, now waiting for results queue to drain 46400 1727204610.12055: waiting for pending results... 
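The skip above follows directly from the stat result: the set_fact task at get_profile_stat.yml:17 is guarded by profile_stat.stat.exists, which is false because the ifcfg file does not exist on managed-node2. A minimal sketch of that guard; the fact name is hypothetical, since this excerpt only shows the task name and the false_condition:

- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_exists: true   # hypothetical fact name; not visible in this log excerpt
  when: profile_stat.stat.exists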
46400 1727204610.12359: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204610.12500: in run() - task 0affcd87-79f5-1303-fda8-000000001f1a 46400 1727204610.12520: variable 'ansible_search_path' from source: unknown 46400 1727204610.12528: variable 'ansible_search_path' from source: unknown 46400 1727204610.12568: calling self._execute() 46400 1727204610.12677: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.12689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.12702: variable 'omit' from source: magic vars 46400 1727204610.13083: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.13099: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.13109: variable 'omit' from source: magic vars 46400 1727204610.13177: variable 'omit' from source: magic vars 46400 1727204610.13292: variable 'profile' from source: play vars 46400 1727204610.13302: variable 'interface' from source: play vars 46400 1727204610.13374: variable 'interface' from source: play vars 46400 1727204610.13399: variable 'omit' from source: magic vars 46400 1727204610.13451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204610.13498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204610.13528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204610.13552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.13576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.13616: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204610.13625: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.13632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.13735: Set connection var ansible_shell_type to sh 46400 1727204610.13749: Set connection var ansible_shell_executable to /bin/sh 46400 1727204610.13758: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204610.13769: Set connection var ansible_connection to ssh 46400 1727204610.13778: Set connection var ansible_pipelining to False 46400 1727204610.13787: Set connection var ansible_timeout to 10 46400 1727204610.13819: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.13826: variable 'ansible_connection' from source: unknown 46400 1727204610.13832: variable 'ansible_module_compression' from source: unknown 46400 1727204610.13837: variable 'ansible_shell_type' from source: unknown 46400 1727204610.13843: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.13849: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.13855: variable 'ansible_pipelining' from source: unknown 46400 1727204610.13861: variable 'ansible_timeout' from source: unknown 46400 1727204610.13870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.14018: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204610.14036: variable 'omit' from source: magic vars 46400 1727204610.14047: starting attempt loop 46400 1727204610.14053: running the handler 46400 1727204610.14069: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204610.14095: _low_level_execute_command(): starting 46400 1727204610.14107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204610.14841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204610.14859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.14878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.14900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.14943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.14954: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204610.14973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.14997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204610.15008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204610.15019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204610.15031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.15046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.15063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.15079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.15091: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204610.15111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.15186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.15203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.15221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.15304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.16975: stdout chunk (state=3): >>>/root <<< 46400 1727204610.17193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204610.17196: stdout chunk (state=3): >>><<< 46400 1727204610.17199: stderr chunk (state=3): >>><<< 46400 1727204610.17271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204610.17282: _low_level_execute_command(): starting 46400 1727204610.17285: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180 `" && echo ansible-tmp-1727204610.172224-53248-122644364823180="` echo /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180 `" ) && sleep 0' 46400 1727204610.19076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204610.19192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.19210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.19226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.19272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.19285: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204610.19298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.19318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204610.19328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204610.19337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204610.19347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.19358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.19374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.19385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.19396: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204610.19408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.19489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.19510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.19529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.19604: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 46400 1727204610.21499: stdout chunk (state=3): >>>ansible-tmp-1727204610.172224-53248-122644364823180=/root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180 <<< 46400 1727204610.21711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204610.21715: stdout chunk (state=3): >>><<< 46400 1727204610.21717: stderr chunk (state=3): >>><<< 46400 1727204610.22061: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204610.172224-53248-122644364823180=/root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204610.22067: variable 'ansible_module_compression' from source: unknown 46400 1727204610.22069: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204610.22072: variable 'ansible_facts' from source: unknown 46400 1727204610.22074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/AnsiballZ_command.py 46400 1727204610.22623: Sending initial data 46400 1727204610.22627: Sent initial data (155 bytes) 46400 1727204610.24820: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204610.24902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.24919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.24950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.24999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.25048: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204610.25119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.25138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204610.25163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204610.25181: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204610.25193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.25207: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.25223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.25234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.25246: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204610.25269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.25348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.25500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.25519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.25618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.27426: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204610.27461: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204610.27505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpcpbvikyl /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/AnsiballZ_command.py <<< 46400 1727204610.27538: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204610.28993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204610.29067: stderr chunk (state=3): >>><<< 46400 1727204610.29072: stdout chunk (state=3): >>><<< 46400 1727204610.29098: done transferring module to remote 46400 1727204610.29106: _low_level_execute_command(): starting 46400 1727204610.29112: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/ /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/AnsiballZ_command.py && sleep 0' 46400 1727204610.30312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204610.30335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.30344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.30358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.30398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.30405: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204610.30414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.30427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 46400 1727204610.30434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204610.30441: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204610.30449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.30457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.30471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.30478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.30485: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204610.30494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.30566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.30580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.30590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.30662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.32516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204610.32519: stdout chunk (state=3): >>><<< 46400 1727204610.32522: stderr chunk (state=3): >>><<< 46400 1727204610.32622: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204610.32626: _low_level_execute_command(): starting 46400 1727204610.32628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/AnsiballZ_command.py && sleep 0' 46400 1727204610.33360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204610.33585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.33595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.33612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.33645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
<<< 46400 1727204610.33652: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204610.33661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.33678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204610.33685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204610.33691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204610.33698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.33708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.33718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.33726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204610.33729: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204610.33738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.34123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.34137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.34150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.34223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.49152: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:30.472555", "end": "2024-09-24 15:03:30.490565", "delta": "0:00:00.018010", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204610.50350: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204610.50410: stderr chunk (state=3): >>><<< 46400 1727204610.50414: stdout chunk (state=3): >>><<< 46400 1727204610.50430: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:30.472555", "end": "2024-09-24 15:03:30.490565", "delta": "0:00:00.018010", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
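The "Get NM profile info" task is a shell pipeline whose exact command line is in the logged module args: nmcli lists connection names and their backing files, and the greps keep only a 'statebr' profile stored under /etc. Nothing matches, so grep exits 1 and the module reports rc=1; the "...ignoring" marker in the result that follows indicates the failure is tolerated. A minimal sketch of the task under those assumptions; the register name and ignore_errors are inferred, not shown verbatim in this excerpt:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc
  register: nm_profile_info   # hypothetical name; a later task sets flags from this output
  ignore_errors: true         # inferred from the "...ignoring" marker in the reported result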
46400 1727204610.50465: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204610.50476: _low_level_execute_command(): starting 46400 1727204610.50482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204610.172224-53248-122644364823180/ > /dev/null 2>&1 && sleep 0' 46400 1727204610.50943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.50947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.50995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.50999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204610.51001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.51062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.51068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.51071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.51111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.52895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204610.52943: stderr chunk (state=3): >>><<< 46400 1727204610.52946: stdout chunk (state=3): >>><<< 46400 1727204610.52965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204610.52974: handler run complete 46400 1727204610.52992: Evaluated conditional (False): False 46400 1727204610.53000: attempt loop complete, returning result 46400 1727204610.53003: _execute() done 46400 1727204610.53005: dumping result to json 46400 1727204610.53010: done dumping result, returning 46400 1727204610.53017: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-000000001f1a] 46400 1727204610.53022: sending task result for task 0affcd87-79f5-1303-fda8-000000001f1a 46400 1727204610.53120: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f1a 46400 1727204610.53123: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018010", "end": "2024-09-24 15:03:30.490565", "rc": 1, "start": "2024-09-24 15:03:30.472555" } MSG: non-zero return code ...ignoring 46400 1727204610.53193: no more pending results, returning what we have 46400 1727204610.53197: results queue empty 46400 1727204610.53198: checking for any_errors_fatal 46400 1727204610.53204: done checking for any_errors_fatal 46400 1727204610.53205: checking for max_fail_percentage 46400 1727204610.53207: done checking for max_fail_percentage 46400 1727204610.53207: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.53208: done checking to see if all hosts have failed 46400 1727204610.53209: getting the remaining hosts for this loop 46400 1727204610.53210: done getting the remaining hosts for this loop 46400 1727204610.53215: getting the next task for host managed-node2 46400 1727204610.53223: done getting next task for host managed-node2 46400 1727204610.53226: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204610.53230: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.53235: getting variables 46400 1727204610.53237: in VariableManager get_vars() 46400 1727204610.53281: Calling all_inventory to load vars for managed-node2 46400 1727204610.53284: Calling groups_inventory to load vars for managed-node2 46400 1727204610.53288: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.53298: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.53301: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.53303: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.54314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.55218: done with get_vars() 46400 1727204610.55236: done getting variables 46400 1727204610.55286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.436) 0:01:40.837 ***** 46400 1727204610.55317: entering _queue_task() for managed-node2/set_fact 46400 1727204610.55559: worker is 1 (out of 1 available) 46400 1727204610.55574: exiting _queue_task() for managed-node2/set_fact 46400 1727204610.55588: done queuing things up, now waiting for results queue to drain 46400 1727204610.55590: waiting for pending results... 
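For orientation, the task whose result is dumped above can be reconstructed roughly from the module arguments in the log. This is only a sketch inferred from this output, not the actual contents of the role's get_profile_stat.yml: the registered variable name nm_profile_exists comes from the conditional evaluated by the next task, and ignore_errors is an assumption based on the "...ignoring" marker that follows the fatal result.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc   # renders to the exact pipeline shown above when profile == 'statebr'
  register: nm_profile_exists   # name inferred from the 'nm_profile_exists.rc == 0' conditional below
  ignore_errors: true           # assumption, matching the '...ignoring' marker after the fatal result

Because grep exits non-zero when nothing matches, rc=1 here simply means no NetworkManager connection named statebr has a backing file under /etc, which is exactly the state the rest of this block goes on to assert.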
46400 1727204610.55785: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204610.55859: in run() - task 0affcd87-79f5-1303-fda8-000000001f1b 46400 1727204610.55875: variable 'ansible_search_path' from source: unknown 46400 1727204610.55879: variable 'ansible_search_path' from source: unknown 46400 1727204610.55909: calling self._execute() 46400 1727204610.55991: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.55994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.56004: variable 'omit' from source: magic vars 46400 1727204610.56291: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.56301: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.56397: variable 'nm_profile_exists' from source: set_fact 46400 1727204610.56407: Evaluated conditional (nm_profile_exists.rc == 0): False 46400 1727204610.56410: when evaluation is False, skipping this task 46400 1727204610.56413: _execute() done 46400 1727204610.56416: dumping result to json 46400 1727204610.56418: done dumping result, returning 46400 1727204610.56424: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-000000001f1b] 46400 1727204610.56429: sending task result for task 0affcd87-79f5-1303-fda8-000000001f1b 46400 1727204610.56523: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f1b 46400 1727204610.56527: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 46400 1727204610.56583: no more pending results, returning what we have 46400 1727204610.56587: results queue empty 46400 1727204610.56588: checking for any_errors_fatal 46400 1727204610.56599: done checking for any_errors_fatal 46400 1727204610.56600: checking for max_fail_percentage 46400 1727204610.56601: done checking for max_fail_percentage 46400 1727204610.56602: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.56603: done checking to see if all hosts have failed 46400 1727204610.56604: getting the remaining hosts for this loop 46400 1727204610.56605: done getting the remaining hosts for this loop 46400 1727204610.56609: getting the next task for host managed-node2 46400 1727204610.56620: done getting next task for host managed-node2 46400 1727204610.56623: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204610.56628: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.56631: getting variables 46400 1727204610.56633: in VariableManager get_vars() 46400 1727204610.56677: Calling all_inventory to load vars for managed-node2 46400 1727204610.56680: Calling groups_inventory to load vars for managed-node2 46400 1727204610.56684: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.56694: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.56696: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.56699: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.57504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.58414: done with get_vars() 46400 1727204610.58430: done getting variables 46400 1727204610.58477: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204610.58567: variable 'profile' from source: play vars 46400 1727204610.58570: variable 'interface' from source: play vars 46400 1727204610.58614: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.033) 0:01:40.870 ***** 46400 1727204610.58638: entering _queue_task() for managed-node2/command 46400 1727204610.58873: worker is 1 (out of 1 available) 46400 1727204610.58885: exiting _queue_task() for managed-node2/command 46400 1727204610.58899: done queuing things up, now waiting for results queue to drain 46400 1727204610.58900: waiting for pending results... 
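The skip recorded above follows directly from that failed lookup: the set_fact task at get_profile_stat.yml:35 only fires when the registered nmcli command succeeded. A minimal sketch, assuming the flag names (lsr_net_profile_exists is confirmed by the "profile is absent" assert further down in this log; the second flag name is guessed from the task title only):

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true           # this flag is what the later assert checks
    lsr_net_profile_ansible_managed: true  # hypothetical name, inferred only from the task title
  when: nm_profile_exists.rc == 0          # False here (rc was 1), hence the skip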
46400 1727204610.59090: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204610.59182: in run() - task 0affcd87-79f5-1303-fda8-000000001f1d 46400 1727204610.59194: variable 'ansible_search_path' from source: unknown 46400 1727204610.59197: variable 'ansible_search_path' from source: unknown 46400 1727204610.59225: calling self._execute() 46400 1727204610.59304: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.59308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.59317: variable 'omit' from source: magic vars 46400 1727204610.59588: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.59597: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.59685: variable 'profile_stat' from source: set_fact 46400 1727204610.59695: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204610.59699: when evaluation is False, skipping this task 46400 1727204610.59703: _execute() done 46400 1727204610.59705: dumping result to json 46400 1727204610.59708: done dumping result, returning 46400 1727204610.59710: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000001f1d] 46400 1727204610.59718: sending task result for task 0affcd87-79f5-1303-fda8-000000001f1d 46400 1727204610.59803: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f1d 46400 1727204610.59805: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204610.59858: no more pending results, returning what we have 46400 1727204610.59863: results queue empty 46400 1727204610.59865: checking for any_errors_fatal 46400 1727204610.59871: done checking for any_errors_fatal 46400 1727204610.59872: checking for max_fail_percentage 46400 1727204610.59873: done checking for max_fail_percentage 46400 1727204610.59874: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.59875: done checking to see if all hosts have failed 46400 1727204610.59876: getting the remaining hosts for this loop 46400 1727204610.59878: done getting the remaining hosts for this loop 46400 1727204610.59882: getting the next task for host managed-node2 46400 1727204610.59890: done getting next task for host managed-node2 46400 1727204610.59893: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204610.59897: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.59901: getting variables 46400 1727204610.59902: in VariableManager get_vars() 46400 1727204610.59946: Calling all_inventory to load vars for managed-node2 46400 1727204610.59949: Calling groups_inventory to load vars for managed-node2 46400 1727204610.59951: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.59961: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.59965: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.59968: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.60886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.61792: done with get_vars() 46400 1727204610.61808: done getting variables 46400 1727204610.61849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204610.61930: variable 'profile' from source: play vars 46400 1727204610.61933: variable 'interface' from source: play vars 46400 1727204610.61974: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.033) 0:01:40.904 ***** 46400 1727204610.62000: entering _queue_task() for managed-node2/set_fact 46400 1727204610.62223: worker is 1 (out of 1 available) 46400 1727204610.62238: exiting _queue_task() for managed-node2/set_fact 46400 1727204610.62251: done queuing things up, now waiting for results queue to drain 46400 1727204610.62253: waiting for pending results... 
46400 1727204610.62440: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204610.62520: in run() - task 0affcd87-79f5-1303-fda8-000000001f1e 46400 1727204610.62534: variable 'ansible_search_path' from source: unknown 46400 1727204610.62538: variable 'ansible_search_path' from source: unknown 46400 1727204610.62571: calling self._execute() 46400 1727204610.62647: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.62651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.62659: variable 'omit' from source: magic vars 46400 1727204610.62929: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.62938: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.63027: variable 'profile_stat' from source: set_fact 46400 1727204610.63037: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204610.63040: when evaluation is False, skipping this task 46400 1727204610.63043: _execute() done 46400 1727204610.63046: dumping result to json 46400 1727204610.63049: done dumping result, returning 46400 1727204610.63054: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000001f1e] 46400 1727204610.63059: sending task result for task 0affcd87-79f5-1303-fda8-000000001f1e 46400 1727204610.63150: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f1e 46400 1727204610.63154: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204610.63206: no more pending results, returning what we have 46400 1727204610.63211: results queue empty 46400 1727204610.63212: checking for any_errors_fatal 46400 1727204610.63221: done checking for any_errors_fatal 46400 1727204610.63221: checking for max_fail_percentage 46400 1727204610.63223: done checking for max_fail_percentage 46400 1727204610.63224: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.63225: done checking to see if all hosts have failed 46400 1727204610.63226: getting the remaining hosts for this loop 46400 1727204610.63227: done getting the remaining hosts for this loop 46400 1727204610.63231: getting the next task for host managed-node2 46400 1727204610.63241: done getting next task for host managed-node2 46400 1727204610.63244: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204610.63248: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.63251: getting variables 46400 1727204610.63252: in VariableManager get_vars() 46400 1727204610.63292: Calling all_inventory to load vars for managed-node2 46400 1727204610.63299: Calling groups_inventory to load vars for managed-node2 46400 1727204610.63302: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.63312: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.63314: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.63316: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.64106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.65110: done with get_vars() 46400 1727204610.65126: done getting variables 46400 1727204610.65173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204610.65252: variable 'profile' from source: play vars 46400 1727204610.65255: variable 'interface' from source: play vars 46400 1727204610.65296: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.033) 0:01:40.937 ***** 46400 1727204610.65319: entering _queue_task() for managed-node2/command 46400 1727204610.65548: worker is 1 (out of 1 available) 46400 1727204610.65566: exiting _queue_task() for managed-node2/command 46400 1727204610.65578: done queuing things up, now waiting for results queue to drain 46400 1727204610.65580: waiting for pending results... 
46400 1727204610.65760: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204610.65854: in run() - task 0affcd87-79f5-1303-fda8-000000001f1f 46400 1727204610.65870: variable 'ansible_search_path' from source: unknown 46400 1727204610.65873: variable 'ansible_search_path' from source: unknown 46400 1727204610.65901: calling self._execute() 46400 1727204610.65982: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.65987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.65997: variable 'omit' from source: magic vars 46400 1727204610.66270: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.66281: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.66368: variable 'profile_stat' from source: set_fact 46400 1727204610.66379: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204610.66382: when evaluation is False, skipping this task 46400 1727204610.66385: _execute() done 46400 1727204610.66388: dumping result to json 46400 1727204610.66390: done dumping result, returning 46400 1727204610.66393: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000001f1f] 46400 1727204610.66400: sending task result for task 0affcd87-79f5-1303-fda8-000000001f1f 46400 1727204610.66496: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f1f 46400 1727204610.66499: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204610.66544: no more pending results, returning what we have 46400 1727204610.66549: results queue empty 46400 1727204610.66550: checking for any_errors_fatal 46400 1727204610.66559: done checking for any_errors_fatal 46400 1727204610.66560: checking for max_fail_percentage 46400 1727204610.66561: done checking for max_fail_percentage 46400 1727204610.66562: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.66563: done checking to see if all hosts have failed 46400 1727204610.66565: getting the remaining hosts for this loop 46400 1727204610.66567: done getting the remaining hosts for this loop 46400 1727204610.66571: getting the next task for host managed-node2 46400 1727204610.66585: done getting next task for host managed-node2 46400 1727204610.66589: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204610.66593: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.66598: getting variables 46400 1727204610.66599: in VariableManager get_vars() 46400 1727204610.66637: Calling all_inventory to load vars for managed-node2 46400 1727204610.66640: Calling groups_inventory to load vars for managed-node2 46400 1727204610.66643: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.66654: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.66657: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.66660: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.71793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.73183: done with get_vars() 46400 1727204610.73210: done getting variables 46400 1727204610.73266: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204610.73371: variable 'profile' from source: play vars 46400 1727204610.73374: variable 'interface' from source: play vars 46400 1727204610.73436: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.081) 0:01:41.019 ***** 46400 1727204610.73468: entering _queue_task() for managed-node2/set_fact 46400 1727204610.73829: worker is 1 (out of 1 available) 46400 1727204610.73842: exiting _queue_task() for managed-node2/set_fact 46400 1727204610.73855: done queuing things up, now waiting for results queue to drain 46400 1727204610.73858: waiting for pending results... 
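The four ifcfg comment checks in this stretch (the Get/Verify pair for the ansible_managed comment and the Get/Verify pair for the fingerprint comment, task paths get_profile_stat.yml:49, :56, :62 and :69) all share one gate: when: profile_stat.stat.exists. Since no ifcfg-statebr file exists, each is skipped with false_condition "profile_stat.stat.exists". The sketch below illustrates only that gating pattern; the actual grep command and fact contents never appear in this log because the tasks did not run, so both are marked as hypothetical:

- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command, not visible in the log
  register: ifcfg_comment                                                              # hypothetical register name
  when: profile_stat.stat.exists

- name: "Verify the ansible_managed comment in ifcfg-{{ profile }}"
  set_fact:
    profile_is_ansible_managed: "{{ ifcfg_comment.rc == 0 }}"   # hypothetical fact; only the set_fact module and the when: gate are confirmed above
  when: profile_stat.stat.exists

The fingerprint pair follows the same shape.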
46400 1727204610.74188: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204610.74350: in run() - task 0affcd87-79f5-1303-fda8-000000001f20 46400 1727204610.74378: variable 'ansible_search_path' from source: unknown 46400 1727204610.74387: variable 'ansible_search_path' from source: unknown 46400 1727204610.74434: calling self._execute() 46400 1727204610.74547: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.74565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.74582: variable 'omit' from source: magic vars 46400 1727204610.74996: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.75015: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.75150: variable 'profile_stat' from source: set_fact 46400 1727204610.75172: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204610.75181: when evaluation is False, skipping this task 46400 1727204610.75188: _execute() done 46400 1727204610.75196: dumping result to json 46400 1727204610.75203: done dumping result, returning 46400 1727204610.75213: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000001f20] 46400 1727204610.75224: sending task result for task 0affcd87-79f5-1303-fda8-000000001f20 46400 1727204610.75327: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f20 46400 1727204610.75331: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204610.75376: no more pending results, returning what we have 46400 1727204610.75380: results queue empty 46400 1727204610.75381: checking for any_errors_fatal 46400 1727204610.75392: done checking for any_errors_fatal 46400 1727204610.75393: checking for max_fail_percentage 46400 1727204610.75395: done checking for max_fail_percentage 46400 1727204610.75396: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.75396: done checking to see if all hosts have failed 46400 1727204610.75397: getting the remaining hosts for this loop 46400 1727204610.75399: done getting the remaining hosts for this loop 46400 1727204610.75403: getting the next task for host managed-node2 46400 1727204610.75414: done getting next task for host managed-node2 46400 1727204610.75417: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 46400 1727204610.75421: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204610.75436: getting variables 46400 1727204610.75437: in VariableManager get_vars() 46400 1727204610.75483: Calling all_inventory to load vars for managed-node2 46400 1727204610.75486: Calling groups_inventory to load vars for managed-node2 46400 1727204610.75490: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.75502: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.75504: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.75507: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.76427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.77356: done with get_vars() 46400 1727204610.77374: done getting variables 46400 1727204610.77419: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204610.77510: variable 'profile' from source: play vars 46400 1727204610.77514: variable 'interface' from source: play vars 46400 1727204610.77558: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.041) 0:01:41.060 ***** 46400 1727204610.77583: entering _queue_task() for managed-node2/assert 46400 1727204610.77816: worker is 1 (out of 1 available) 46400 1727204610.77830: exiting _queue_task() for managed-node2/assert 46400 1727204610.77842: done queuing things up, now waiting for results queue to drain 46400 1727204610.77844: waiting for pending results... 
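The assert queued above is the payoff of this chain: because the nmcli lookup failed, the profile-exists flag was never flipped to true (the log shows lsr_net_profile_exists coming from an earlier set_fact), so asserting its negation passes. Based on the task name, the assert action module, and the conditional evaluation recorded just below, the task amounts to roughly:

- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists   # evaluates to True below, so the task reports 'All assertions passed'

Any msg or additional conditions the real task may carry are not visible in this log.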
46400 1727204610.78034: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 46400 1727204610.78126: in run() - task 0affcd87-79f5-1303-fda8-000000001e9a 46400 1727204610.78137: variable 'ansible_search_path' from source: unknown 46400 1727204610.78141: variable 'ansible_search_path' from source: unknown 46400 1727204610.78172: calling self._execute() 46400 1727204610.78256: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.78262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.78273: variable 'omit' from source: magic vars 46400 1727204610.78556: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.78569: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.78575: variable 'omit' from source: magic vars 46400 1727204610.78610: variable 'omit' from source: magic vars 46400 1727204610.78686: variable 'profile' from source: play vars 46400 1727204610.78689: variable 'interface' from source: play vars 46400 1727204610.78735: variable 'interface' from source: play vars 46400 1727204610.78750: variable 'omit' from source: magic vars 46400 1727204610.78789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204610.78816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204610.78835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204610.78851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.78860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.78886: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204610.78890: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.78893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.78959: Set connection var ansible_shell_type to sh 46400 1727204610.78972: Set connection var ansible_shell_executable to /bin/sh 46400 1727204610.78977: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204610.78982: Set connection var ansible_connection to ssh 46400 1727204610.78987: Set connection var ansible_pipelining to False 46400 1727204610.78992: Set connection var ansible_timeout to 10 46400 1727204610.79011: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.79014: variable 'ansible_connection' from source: unknown 46400 1727204610.79017: variable 'ansible_module_compression' from source: unknown 46400 1727204610.79019: variable 'ansible_shell_type' from source: unknown 46400 1727204610.79022: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.79025: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.79027: variable 'ansible_pipelining' from source: unknown 46400 1727204610.79030: variable 'ansible_timeout' from source: unknown 46400 1727204610.79033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.79137: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204610.79150: variable 'omit' from source: magic vars 46400 1727204610.79154: starting attempt loop 46400 1727204610.79158: running the handler 46400 1727204610.79238: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204610.79242: Evaluated conditional (not lsr_net_profile_exists): True 46400 1727204610.79248: handler run complete 46400 1727204610.79261: attempt loop complete, returning result 46400 1727204610.79269: _execute() done 46400 1727204610.79272: dumping result to json 46400 1727204610.79277: done dumping result, returning 46400 1727204610.79280: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [0affcd87-79f5-1303-fda8-000000001e9a] 46400 1727204610.79286: sending task result for task 0affcd87-79f5-1303-fda8-000000001e9a 46400 1727204610.79367: done sending task result for task 0affcd87-79f5-1303-fda8-000000001e9a 46400 1727204610.79371: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204610.79433: no more pending results, returning what we have 46400 1727204610.79437: results queue empty 46400 1727204610.79438: checking for any_errors_fatal 46400 1727204610.79445: done checking for any_errors_fatal 46400 1727204610.79446: checking for max_fail_percentage 46400 1727204610.79447: done checking for max_fail_percentage 46400 1727204610.79448: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.79449: done checking to see if all hosts have failed 46400 1727204610.79450: getting the remaining hosts for this loop 46400 1727204610.79451: done getting the remaining hosts for this loop 46400 1727204610.79455: getting the next task for host managed-node2 46400 1727204610.79468: done getting next task for host managed-node2 46400 1727204610.79471: ^ task is: TASK: Conditional asserts 46400 1727204610.79473: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204610.79478: getting variables 46400 1727204610.79479: in VariableManager get_vars() 46400 1727204610.79519: Calling all_inventory to load vars for managed-node2 46400 1727204610.79521: Calling groups_inventory to load vars for managed-node2 46400 1727204610.79525: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.79535: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.79537: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.79540: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.80337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.81259: done with get_vars() 46400 1727204610.81277: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.037) 0:01:41.098 ***** 46400 1727204610.81345: entering _queue_task() for managed-node2/include_tasks 46400 1727204610.81567: worker is 1 (out of 1 available) 46400 1727204610.81582: exiting _queue_task() for managed-node2/include_tasks 46400 1727204610.81595: done queuing things up, now waiting for results queue to drain 46400 1727204610.81597: waiting for pending results... 46400 1727204610.81772: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204610.81847: in run() - task 0affcd87-79f5-1303-fda8-00000000174a 46400 1727204610.81862: variable 'ansible_search_path' from source: unknown 46400 1727204610.81869: variable 'ansible_search_path' from source: unknown 46400 1727204610.82078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204610.84060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204610.84111: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204610.84142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204610.84171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204610.84192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204610.84254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204610.84279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204610.84296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204610.84325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204610.84336: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204610.84422: variable 'lsr_assert_when' from source: include params 46400 1727204610.84518: variable 'network_provider' from source: set_fact 46400 1727204610.84579: variable 'omit' from source: magic vars 46400 1727204610.84667: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.84681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.84687: variable 'omit' from source: magic vars 46400 1727204610.84834: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.84842: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.84927: variable 'item' from source: unknown 46400 1727204610.84934: Evaluated conditional (item['condition']): True 46400 1727204610.84993: variable 'item' from source: unknown 46400 1727204610.85022: variable 'item' from source: unknown 46400 1727204610.85068: variable 'item' from source: unknown 46400 1727204610.85203: dumping result to json 46400 1727204610.85205: done dumping result, returning 46400 1727204610.85207: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-00000000174a] 46400 1727204610.85209: sending task result for task 0affcd87-79f5-1303-fda8-00000000174a 46400 1727204610.85250: done sending task result for task 0affcd87-79f5-1303-fda8-00000000174a 46400 1727204610.85253: WORKER PROCESS EXITING 46400 1727204610.85285: no more pending results, returning what we have 46400 1727204610.85290: in VariableManager get_vars() 46400 1727204610.85339: Calling all_inventory to load vars for managed-node2 46400 1727204610.85342: Calling groups_inventory to load vars for managed-node2 46400 1727204610.85345: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.85356: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.85368: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.85373: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.86403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.87311: done with get_vars() 46400 1727204610.87331: variable 'ansible_search_path' from source: unknown 46400 1727204610.87333: variable 'ansible_search_path' from source: unknown 46400 1727204610.87366: we have included files to process 46400 1727204610.87367: generating all_blocks data 46400 1727204610.87369: done generating all_blocks data 46400 1727204610.87373: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204610.87374: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204610.87375: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204610.87455: in VariableManager get_vars() 46400 1727204610.87476: done with get_vars() 46400 1727204610.87557: done processing included file 46400 1727204610.87561: iterating over new_blocks loaded from include file 46400 1727204610.87562: in VariableManager get_vars() 46400 1727204610.87576: done 
with get_vars() 46400 1727204610.87577: filtering new block on tags 46400 1727204610.87602: done filtering new block on tags 46400 1727204610.87604: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 46400 1727204610.87608: extending task lists for all hosts with included blocks 46400 1727204610.88320: done extending task lists 46400 1727204610.88321: done processing included files 46400 1727204610.88322: results queue empty 46400 1727204610.88322: checking for any_errors_fatal 46400 1727204610.88325: done checking for any_errors_fatal 46400 1727204610.88325: checking for max_fail_percentage 46400 1727204610.88326: done checking for max_fail_percentage 46400 1727204610.88327: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.88327: done checking to see if all hosts have failed 46400 1727204610.88328: getting the remaining hosts for this loop 46400 1727204610.88329: done getting the remaining hosts for this loop 46400 1727204610.88330: getting the next task for host managed-node2 46400 1727204610.88333: done getting next task for host managed-node2 46400 1727204610.88335: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204610.88337: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204610.88344: getting variables 46400 1727204610.88345: in VariableManager get_vars() 46400 1727204610.88353: Calling all_inventory to load vars for managed-node2 46400 1727204610.88355: Calling groups_inventory to load vars for managed-node2 46400 1727204610.88356: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.88365: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.88367: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.88369: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.89092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.90079: done with get_vars() 46400 1727204610.90095: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.088) 0:01:41.186 ***** 46400 1727204610.90155: entering _queue_task() for managed-node2/include_tasks 46400 1727204610.90416: worker is 1 (out of 1 available) 46400 1727204610.90429: exiting _queue_task() for managed-node2/include_tasks 46400 1727204610.90442: done queuing things up, now waiting for results queue to drain 46400 1727204610.90444: waiting for pending results... 46400 1727204610.90633: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204610.90722: in run() - task 0affcd87-79f5-1303-fda8-000000001f59 46400 1727204610.90732: variable 'ansible_search_path' from source: unknown 46400 1727204610.90735: variable 'ansible_search_path' from source: unknown 46400 1727204610.90766: calling self._execute() 46400 1727204610.90852: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.90856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.90871: variable 'omit' from source: magic vars 46400 1727204610.91162: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.91176: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.91182: _execute() done 46400 1727204610.91186: dumping result to json 46400 1727204610.91188: done dumping result, returning 46400 1727204610.91194: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-000000001f59] 46400 1727204610.91200: sending task result for task 0affcd87-79f5-1303-fda8-000000001f59 46400 1727204610.91296: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f59 46400 1727204610.91298: WORKER PROCESS EXITING 46400 1727204610.91347: no more pending results, returning what we have 46400 1727204610.91352: in VariableManager get_vars() 46400 1727204610.91405: Calling all_inventory to load vars for managed-node2 46400 1727204610.91408: Calling groups_inventory to load vars for managed-node2 46400 1727204610.91411: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.91424: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.91427: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.91429: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.92278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 46400 1727204610.93205: done with get_vars() 46400 1727204610.93220: variable 'ansible_search_path' from source: unknown 46400 1727204610.93221: variable 'ansible_search_path' from source: unknown 46400 1727204610.93330: variable 'item' from source: include params 46400 1727204610.93355: we have included files to process 46400 1727204610.93356: generating all_blocks data 46400 1727204610.93358: done generating all_blocks data 46400 1727204610.93359: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204610.93360: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204610.93362: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204610.93490: done processing included file 46400 1727204610.93492: iterating over new_blocks loaded from include file 46400 1727204610.93493: in VariableManager get_vars() 46400 1727204610.93506: done with get_vars() 46400 1727204610.93508: filtering new block on tags 46400 1727204610.93525: done filtering new block on tags 46400 1727204610.93527: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204610.93530: extending task lists for all hosts with included blocks 46400 1727204610.93633: done extending task lists 46400 1727204610.93634: done processing included files 46400 1727204610.93634: results queue empty 46400 1727204610.93635: checking for any_errors_fatal 46400 1727204610.93638: done checking for any_errors_fatal 46400 1727204610.93639: checking for max_fail_percentage 46400 1727204610.93639: done checking for max_fail_percentage 46400 1727204610.93640: checking to see if all hosts have failed and the running result is not ok 46400 1727204610.93641: done checking to see if all hosts have failed 46400 1727204610.93641: getting the remaining hosts for this loop 46400 1727204610.93642: done getting the remaining hosts for this loop 46400 1727204610.93643: getting the next task for host managed-node2 46400 1727204610.93646: done getting next task for host managed-node2 46400 1727204610.93648: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204610.93650: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204610.93652: getting variables 46400 1727204610.93652: in VariableManager get_vars() 46400 1727204610.93660: Calling all_inventory to load vars for managed-node2 46400 1727204610.93662: Calling groups_inventory to load vars for managed-node2 46400 1727204610.93665: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204610.93669: Calling all_plugins_play to load vars for managed-node2 46400 1727204610.93670: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204610.93672: Calling groups_plugins_play to load vars for managed-node2 46400 1727204610.94399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204610.95295: done with get_vars() 46400 1727204610.95311: done getting variables 46400 1727204610.95404: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.052) 0:01:41.238 ***** 46400 1727204610.95427: entering _queue_task() for managed-node2/stat 46400 1727204610.95687: worker is 1 (out of 1 available) 46400 1727204610.95702: exiting _queue_task() for managed-node2/stat 46400 1727204610.95715: done queuing things up, now waiting for results queue to drain 46400 1727204610.95716: waiting for pending results... 46400 1727204610.95906: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204610.96001: in run() - task 0affcd87-79f5-1303-fda8-000000001fe8 46400 1727204610.96008: variable 'ansible_search_path' from source: unknown 46400 1727204610.96011: variable 'ansible_search_path' from source: unknown 46400 1727204610.96042: calling self._execute() 46400 1727204610.96121: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.96127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.96135: variable 'omit' from source: magic vars 46400 1727204610.96419: variable 'ansible_distribution_major_version' from source: facts 46400 1727204610.96432: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204610.96436: variable 'omit' from source: magic vars 46400 1727204610.96484: variable 'omit' from source: magic vars 46400 1727204610.96554: variable 'interface' from source: play vars 46400 1727204610.96579: variable 'omit' from source: magic vars 46400 1727204610.96612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204610.96639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204610.96658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204610.96676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.96686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204610.96710: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204610.96714: variable 'ansible_host' from 
source: host vars for 'managed-node2' 46400 1727204610.96716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.96787: Set connection var ansible_shell_type to sh 46400 1727204610.96795: Set connection var ansible_shell_executable to /bin/sh 46400 1727204610.96801: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204610.96806: Set connection var ansible_connection to ssh 46400 1727204610.96812: Set connection var ansible_pipelining to False 46400 1727204610.96817: Set connection var ansible_timeout to 10 46400 1727204610.96836: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.96840: variable 'ansible_connection' from source: unknown 46400 1727204610.96843: variable 'ansible_module_compression' from source: unknown 46400 1727204610.96845: variable 'ansible_shell_type' from source: unknown 46400 1727204610.96847: variable 'ansible_shell_executable' from source: unknown 46400 1727204610.96849: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204610.96852: variable 'ansible_pipelining' from source: unknown 46400 1727204610.96854: variable 'ansible_timeout' from source: unknown 46400 1727204610.96858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204610.97015: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204610.97024: variable 'omit' from source: magic vars 46400 1727204610.97029: starting attempt loop 46400 1727204610.97032: running the handler 46400 1727204610.97044: _low_level_execute_command(): starting 46400 1727204610.97051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204610.97581: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204610.97597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204610.97614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204610.97633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204610.97685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204610.97694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204610.97705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204610.97762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204610.99418: stdout chunk (state=3): >>>/root <<< 46400 1727204610.99520: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 46400 1727204610.99580: stderr chunk (state=3): >>><<< 46400 1727204610.99584: stdout chunk (state=3): >>><<< 46400 1727204610.99605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204610.99617: _low_level_execute_command(): starting 46400 1727204610.99623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920 `" && echo ansible-tmp-1727204610.9960535-53288-87153609472920="` echo /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920 `" ) && sleep 0' 46400 1727204611.00085: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.00103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.00114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204611.00129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.00153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.00192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.00205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.00251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.02137: stdout chunk (state=3): >>>ansible-tmp-1727204610.9960535-53288-87153609472920=/root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920 <<< 46400 1727204611.02258: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.02311: stderr chunk (state=3): >>><<< 46400 1727204611.02315: stdout chunk (state=3): >>><<< 46400 1727204611.02331: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204610.9960535-53288-87153609472920=/root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.02375: variable 'ansible_module_compression' from source: unknown 46400 1727204611.02427: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204611.02456: variable 'ansible_facts' from source: unknown 46400 1727204611.02523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/AnsiballZ_stat.py 46400 1727204611.02633: Sending initial data 46400 1727204611.02637: Sent initial data (152 bytes) 46400 1727204611.03320: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.03326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.03362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.03376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.03387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204611.03393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.03449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.03456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 
1727204611.03506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.05214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204611.05248: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204611.05284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpfe7ru899 /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/AnsiballZ_stat.py <<< 46400 1727204611.05322: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204611.06095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.06205: stderr chunk (state=3): >>><<< 46400 1727204611.06208: stdout chunk (state=3): >>><<< 46400 1727204611.06224: done transferring module to remote 46400 1727204611.06234: _low_level_execute_command(): starting 46400 1727204611.06237: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/ /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/AnsiballZ_stat.py && sleep 0' 46400 1727204611.06697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.06702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.06739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.06752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.06803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.06815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.06862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.08591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.08640: stderr chunk (state=3): >>><<< 46400 1727204611.08643: stdout chunk (state=3): >>><<< 46400 1727204611.08662: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.08672: _low_level_execute_command(): starting 46400 1727204611.08674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/AnsiballZ_stat.py && sleep 0' 46400 1727204611.09112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.09117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.09149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.09162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.09176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.09226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.09233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.09305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.22514: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204611.23524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204611.23591: stderr chunk (state=3): >>><<< 46400 1727204611.23595: stdout chunk (state=3): >>><<< 46400 1727204611.23611: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204611.23635: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204611.23645: _low_level_execute_command(): starting 46400 1727204611.23650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204610.9960535-53288-87153609472920/ > /dev/null 2>&1 && sleep 0' 46400 1727204611.24126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.24130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.24174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.24186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.24234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.24248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.24298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.26095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.26147: stderr chunk (state=3): >>><<< 46400 1727204611.26151: stdout chunk (state=3): >>><<< 46400 1727204611.26169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.26173: handler run complete 46400 1727204611.26193: attempt loop complete, returning result 46400 1727204611.26196: _execute() done 46400 1727204611.26199: dumping result to json 46400 1727204611.26201: done dumping result, returning 46400 1727204611.26208: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000001fe8] 46400 1727204611.26215: sending task result for task 0affcd87-79f5-1303-fda8-000000001fe8 46400 1727204611.26315: done sending task result for task 0affcd87-79f5-1303-fda8-000000001fe8 46400 1727204611.26318: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204611.26379: no more pending results, returning what we have 46400 1727204611.26384: results queue empty 46400 1727204611.26385: checking for any_errors_fatal 46400 1727204611.26387: done checking for any_errors_fatal 46400 1727204611.26387: checking for max_fail_percentage 46400 1727204611.26389: done checking for max_fail_percentage 46400 1727204611.26390: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.26391: done checking to see if all hosts have failed 46400 1727204611.26392: getting the remaining hosts for this loop 46400 1727204611.26393: done getting the remaining hosts for this loop 46400 1727204611.26398: getting the next task for host managed-node2 46400 1727204611.26408: done getting next task for host managed-node2 46400 1727204611.26410: ^ task is: TASK: 
Assert that the interface is absent - '{{ interface }}' 46400 1727204611.26414: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204611.26419: getting variables 46400 1727204611.26421: in VariableManager get_vars() 46400 1727204611.26472: Calling all_inventory to load vars for managed-node2 46400 1727204611.26475: Calling groups_inventory to load vars for managed-node2 46400 1727204611.26478: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.26489: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.26492: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.26494: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.27372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.28297: done with get_vars() 46400 1727204611.28317: done getting variables 46400 1727204611.28365: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204611.28457: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.330) 0:01:41.569 ***** 46400 1727204611.28485: entering _queue_task() for managed-node2/assert 46400 1727204611.28720: worker is 1 (out of 1 available) 46400 1727204611.28733: exiting _queue_task() for managed-node2/assert 46400 1727204611.28744: done queuing things up, now waiting for results queue to drain 46400 1727204611.28746: waiting for pending results... 
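The included file get_interface_stat.yml, whose execution against managed-node2 is traced above (transfer of AnsiballZ_stat.py, invocation, result {"changed": false, "stat": {"exists": false}}), is not reproduced in this log. A minimal sketch of what it plausibly contains, reconstructed only from the module arguments recorded in the invocation (path /sys/class/net/statebr with attribute, checksum and mime collection disabled) and from the interface_stat variable consumed by the following assert, is given here; the register name and exact layout are assumptions, not quoted from the test file:

    # get_interface_stat.yml -- hypothetical reconstruction from the logged stat invocation
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"   # logged as /sys/class/net/statebr
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat                    # assumed name; the later assert reads interface_stat.stat.exists
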
46400 1727204611.28931: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 46400 1727204611.29016: in run() - task 0affcd87-79f5-1303-fda8-000000001f5a 46400 1727204611.29026: variable 'ansible_search_path' from source: unknown 46400 1727204611.29029: variable 'ansible_search_path' from source: unknown 46400 1727204611.29057: calling self._execute() 46400 1727204611.29135: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.29138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.29147: variable 'omit' from source: magic vars 46400 1727204611.29420: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.29430: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.29436: variable 'omit' from source: magic vars 46400 1727204611.29471: variable 'omit' from source: magic vars 46400 1727204611.29540: variable 'interface' from source: play vars 46400 1727204611.29553: variable 'omit' from source: magic vars 46400 1727204611.29591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204611.29620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204611.29638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204611.29652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.29663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.29686: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204611.29689: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.29692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.29766: Set connection var ansible_shell_type to sh 46400 1727204611.29773: Set connection var ansible_shell_executable to /bin/sh 46400 1727204611.29778: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204611.29783: Set connection var ansible_connection to ssh 46400 1727204611.29788: Set connection var ansible_pipelining to False 46400 1727204611.29793: Set connection var ansible_timeout to 10 46400 1727204611.29811: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.29814: variable 'ansible_connection' from source: unknown 46400 1727204611.29819: variable 'ansible_module_compression' from source: unknown 46400 1727204611.29821: variable 'ansible_shell_type' from source: unknown 46400 1727204611.29824: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.29826: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.29830: variable 'ansible_pipelining' from source: unknown 46400 1727204611.29832: variable 'ansible_timeout' from source: unknown 46400 1727204611.29834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.29934: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204611.29944: variable 'omit' from source: magic vars 46400 1727204611.29947: starting attempt loop 46400 1727204611.29950: running the handler 46400 1727204611.30050: variable 'interface_stat' from source: set_fact 46400 1727204611.30058: Evaluated conditional (not interface_stat.stat.exists): True 46400 1727204611.30065: handler run complete 46400 1727204611.30078: attempt loop complete, returning result 46400 1727204611.30082: _execute() done 46400 1727204611.30084: dumping result to json 46400 1727204611.30087: done dumping result, returning 46400 1727204611.30092: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [0affcd87-79f5-1303-fda8-000000001f5a] 46400 1727204611.30098: sending task result for task 0affcd87-79f5-1303-fda8-000000001f5a 46400 1727204611.30190: done sending task result for task 0affcd87-79f5-1303-fda8-000000001f5a 46400 1727204611.30192: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204611.30235: no more pending results, returning what we have 46400 1727204611.30240: results queue empty 46400 1727204611.30241: checking for any_errors_fatal 46400 1727204611.30252: done checking for any_errors_fatal 46400 1727204611.30252: checking for max_fail_percentage 46400 1727204611.30254: done checking for max_fail_percentage 46400 1727204611.30255: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.30256: done checking to see if all hosts have failed 46400 1727204611.30257: getting the remaining hosts for this loop 46400 1727204611.30261: done getting the remaining hosts for this loop 46400 1727204611.30267: getting the next task for host managed-node2 46400 1727204611.30277: done getting next task for host managed-node2 46400 1727204611.30280: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204611.30282: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204611.30292: getting variables 46400 1727204611.30294: in VariableManager get_vars() 46400 1727204611.30330: Calling all_inventory to load vars for managed-node2 46400 1727204611.30333: Calling groups_inventory to load vars for managed-node2 46400 1727204611.30336: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.30346: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.30349: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.30351: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.31320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.32235: done with get_vars() 46400 1727204611.32254: done getting variables 46400 1727204611.32301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204611.32392: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.039) 0:01:41.608 ***** 46400 1727204611.32415: entering _queue_task() for managed-node2/debug 46400 1727204611.32658: worker is 1 (out of 1 available) 46400 1727204611.32676: exiting _queue_task() for managed-node2/debug 46400 1727204611.32687: done queuing things up, now waiting for results queue to drain 46400 1727204611.32689: waiting for pending results... 
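The assert completed above comes from assert_device_absent.yml:5. Its condition is recorded verbatim in the trace ("Evaluated conditional (not interface_stat.stat.exists): True"), so the task can be sketched as follows; only the task name and the condition are taken from the log, the surrounding YAML (including the absence of a custom failure message) is an assumption:

    # assert_device_absent.yml -- sketch inferred from the logged conditional
    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists   # true here because stat reported exists: false
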
46400 1727204611.32878: running TaskExecutor() for managed-node2/TASK: Success in test 'I can take a profile down that is absent' 46400 1727204611.32950: in run() - task 0affcd87-79f5-1303-fda8-00000000174b 46400 1727204611.32966: variable 'ansible_search_path' from source: unknown 46400 1727204611.32970: variable 'ansible_search_path' from source: unknown 46400 1727204611.32995: calling self._execute() 46400 1727204611.33071: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.33075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.33081: variable 'omit' from source: magic vars 46400 1727204611.33352: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.33365: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.33370: variable 'omit' from source: magic vars 46400 1727204611.33400: variable 'omit' from source: magic vars 46400 1727204611.33472: variable 'lsr_description' from source: include params 46400 1727204611.33486: variable 'omit' from source: magic vars 46400 1727204611.33521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204611.33550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204611.33571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204611.33584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.33593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.33616: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204611.33619: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.33622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.33695: Set connection var ansible_shell_type to sh 46400 1727204611.33703: Set connection var ansible_shell_executable to /bin/sh 46400 1727204611.33708: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204611.33714: Set connection var ansible_connection to ssh 46400 1727204611.33718: Set connection var ansible_pipelining to False 46400 1727204611.33723: Set connection var ansible_timeout to 10 46400 1727204611.33742: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.33745: variable 'ansible_connection' from source: unknown 46400 1727204611.33748: variable 'ansible_module_compression' from source: unknown 46400 1727204611.33750: variable 'ansible_shell_type' from source: unknown 46400 1727204611.33752: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.33754: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.33757: variable 'ansible_pipelining' from source: unknown 46400 1727204611.33763: variable 'ansible_timeout' from source: unknown 46400 1727204611.33767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.33868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204611.33875: variable 'omit' from source: magic vars 46400 1727204611.33880: starting attempt loop 46400 1727204611.33884: running the handler 46400 1727204611.33919: handler run complete 46400 1727204611.33930: attempt loop complete, returning result 46400 1727204611.33933: _execute() done 46400 1727204611.33936: dumping result to json 46400 1727204611.33938: done dumping result, returning 46400 1727204611.33943: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can take a profile down that is absent' [0affcd87-79f5-1303-fda8-00000000174b] 46400 1727204611.33948: sending task result for task 0affcd87-79f5-1303-fda8-00000000174b 46400 1727204611.34035: done sending task result for task 0affcd87-79f5-1303-fda8-00000000174b 46400 1727204611.34039: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 46400 1727204611.34085: no more pending results, returning what we have 46400 1727204611.34090: results queue empty 46400 1727204611.34091: checking for any_errors_fatal 46400 1727204611.34098: done checking for any_errors_fatal 46400 1727204611.34099: checking for max_fail_percentage 46400 1727204611.34101: done checking for max_fail_percentage 46400 1727204611.34108: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.34109: done checking to see if all hosts have failed 46400 1727204611.34109: getting the remaining hosts for this loop 46400 1727204611.34111: done getting the remaining hosts for this loop 46400 1727204611.34116: getting the next task for host managed-node2 46400 1727204611.34123: done getting next task for host managed-node2 46400 1727204611.34126: ^ task is: TASK: Cleanup 46400 1727204611.34128: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204611.34133: getting variables 46400 1727204611.34135: in VariableManager get_vars() 46400 1727204611.34177: Calling all_inventory to load vars for managed-node2 46400 1727204611.34180: Calling groups_inventory to load vars for managed-node2 46400 1727204611.34183: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.34194: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.34196: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.34198: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.35047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.35983: done with get_vars() 46400 1727204611.36000: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.036) 0:01:41.645 ***** 46400 1727204611.36074: entering _queue_task() for managed-node2/include_tasks 46400 1727204611.36313: worker is 1 (out of 1 available) 46400 1727204611.36327: exiting _queue_task() for managed-node2/include_tasks 46400 1727204611.36339: done queuing things up, now waiting for results queue to drain 46400 1727204611.36341: waiting for pending results... 46400 1727204611.36531: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204611.36602: in run() - task 0affcd87-79f5-1303-fda8-00000000174f 46400 1727204611.36615: variable 'ansible_search_path' from source: unknown 46400 1727204611.36619: variable 'ansible_search_path' from source: unknown 46400 1727204611.36654: variable 'lsr_cleanup' from source: include params 46400 1727204611.36812: variable 'lsr_cleanup' from source: include params 46400 1727204611.36868: variable 'omit' from source: magic vars 46400 1727204611.36977: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.36985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.36993: variable 'omit' from source: magic vars 46400 1727204611.37166: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.37180: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.37186: variable 'item' from source: unknown 46400 1727204611.37235: variable 'item' from source: unknown 46400 1727204611.37259: variable 'item' from source: unknown 46400 1727204611.37311: variable 'item' from source: unknown 46400 1727204611.37425: dumping result to json 46400 1727204611.37428: done dumping result, returning 46400 1727204611.37430: done running TaskExecutor() for managed-node2/TASK: Cleanup [0affcd87-79f5-1303-fda8-00000000174f] 46400 1727204611.37432: sending task result for task 0affcd87-79f5-1303-fda8-00000000174f 46400 1727204611.37468: done sending task result for task 0affcd87-79f5-1303-fda8-00000000174f 46400 1727204611.37471: WORKER PROCESS EXITING 46400 1727204611.37492: no more pending results, returning what we have 46400 1727204611.37497: in VariableManager get_vars() 46400 1727204611.37547: Calling all_inventory to load vars for managed-node2 46400 1727204611.37549: Calling groups_inventory to load vars for managed-node2 46400 1727204611.37553: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.37570: Calling all_plugins_play to load vars for 
managed-node2 46400 1727204611.37573: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.37576: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.38541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.39453: done with get_vars() 46400 1727204611.39471: variable 'ansible_search_path' from source: unknown 46400 1727204611.39472: variable 'ansible_search_path' from source: unknown 46400 1727204611.39499: we have included files to process 46400 1727204611.39500: generating all_blocks data 46400 1727204611.39501: done generating all_blocks data 46400 1727204611.39504: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204611.39505: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204611.39506: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204611.39675: done processing included file 46400 1727204611.39676: iterating over new_blocks loaded from include file 46400 1727204611.39677: in VariableManager get_vars() 46400 1727204611.39691: done with get_vars() 46400 1727204611.39692: filtering new block on tags 46400 1727204611.39709: done filtering new block on tags 46400 1727204611.39710: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204611.39714: extending task lists for all hosts with included blocks 46400 1727204611.40505: done extending task lists 46400 1727204611.40506: done processing included files 46400 1727204611.40506: results queue empty 46400 1727204611.40507: checking for any_errors_fatal 46400 1727204611.40509: done checking for any_errors_fatal 46400 1727204611.40510: checking for max_fail_percentage 46400 1727204611.40511: done checking for max_fail_percentage 46400 1727204611.40511: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.40512: done checking to see if all hosts have failed 46400 1727204611.40512: getting the remaining hosts for this loop 46400 1727204611.40513: done getting the remaining hosts for this loop 46400 1727204611.40515: getting the next task for host managed-node2 46400 1727204611.40518: done getting next task for host managed-node2 46400 1727204611.40519: ^ task is: TASK: Cleanup profile and device 46400 1727204611.40522: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204611.40523: getting variables 46400 1727204611.40524: in VariableManager get_vars() 46400 1727204611.40534: Calling all_inventory to load vars for managed-node2 46400 1727204611.40536: Calling groups_inventory to load vars for managed-node2 46400 1727204611.40537: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.40543: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.40545: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.40547: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.41305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.42209: done with get_vars() 46400 1727204611.42227: done getting variables 46400 1727204611.42260: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.062) 0:01:41.707 ***** 46400 1727204611.42287: entering _queue_task() for managed-node2/shell 46400 1727204611.42545: worker is 1 (out of 1 available) 46400 1727204611.42558: exiting _queue_task() for managed-node2/shell 46400 1727204611.42574: done queuing things up, now waiting for results queue to drain 46400 1727204611.42575: waiting for pending results... 
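The Cleanup step traced above (run_test.yml:66) is an include_tasks driven by the lsr_cleanup include parameter; the log shows a single loop item, tasks/cleanup_profile+device.yml. A hypothetical reconstruction under those assumptions (the loop structure is inferred from the 'lsr_cleanup'/'item' variables and the "(item=tasks/cleanup_profile+device.yml)" include result, not copied from run_test.yml):

    # run_test.yml, Cleanup task -- hypothetical reconstruction
    - name: Cleanup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_cleanup }}"   # here a single item: tasks/cleanup_profile+device.yml

The distribution-version conditional seen throughout the trace (ansible_distribution_major_version != '6') is evaluated for this task as well, but it may be attached at block or play level rather than on the task itself, so it is omitted from the sketch.
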
46400 1727204611.42767: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204611.42841: in run() - task 0affcd87-79f5-1303-fda8-00000000200b 46400 1727204611.42852: variable 'ansible_search_path' from source: unknown 46400 1727204611.42856: variable 'ansible_search_path' from source: unknown 46400 1727204611.42889: calling self._execute() 46400 1727204611.42969: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.42974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.42983: variable 'omit' from source: magic vars 46400 1727204611.43263: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.43277: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.43283: variable 'omit' from source: magic vars 46400 1727204611.43314: variable 'omit' from source: magic vars 46400 1727204611.43420: variable 'interface' from source: play vars 46400 1727204611.43436: variable 'omit' from source: magic vars 46400 1727204611.43478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204611.43506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204611.43525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204611.43539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.43548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.43580: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204611.43583: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.43586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.43652: Set connection var ansible_shell_type to sh 46400 1727204611.43660: Set connection var ansible_shell_executable to /bin/sh 46400 1727204611.43669: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204611.43676: Set connection var ansible_connection to ssh 46400 1727204611.43685: Set connection var ansible_pipelining to False 46400 1727204611.43690: Set connection var ansible_timeout to 10 46400 1727204611.43711: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.43714: variable 'ansible_connection' from source: unknown 46400 1727204611.43717: variable 'ansible_module_compression' from source: unknown 46400 1727204611.43719: variable 'ansible_shell_type' from source: unknown 46400 1727204611.43722: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.43724: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.43726: variable 'ansible_pipelining' from source: unknown 46400 1727204611.43729: variable 'ansible_timeout' from source: unknown 46400 1727204611.43732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.43876: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204611.43891: variable 'omit' from source: magic vars 46400 1727204611.43901: starting attempt loop 46400 1727204611.43907: running the handler 46400 1727204611.43922: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204611.43944: _low_level_execute_command(): starting 46400 1727204611.43956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204611.44698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.44722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.44726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.44834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.44839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.44842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.44914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.44931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.44953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.45027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.46652: stdout chunk (state=3): >>>/root <<< 46400 1727204611.46756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.46852: stderr chunk (state=3): >>><<< 46400 1727204611.46855: stdout chunk (state=3): >>><<< 46400 1727204611.46901: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.46907: _low_level_execute_command(): starting 46400 1727204611.46909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294 `" && echo ansible-tmp-1727204611.468805-53299-80979770611294="` echo /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294 `" ) && sleep 0' 46400 1727204611.47534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.47546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.47553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.47573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.47611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.47619: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204611.47628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.47642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204611.47651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204611.47657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204611.47673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.47682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.47694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.47703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.47709: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204611.47717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.47801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.47808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.47811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.48230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.49770: stdout chunk (state=3): >>>ansible-tmp-1727204611.468805-53299-80979770611294=/root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294 <<< 46400 1727204611.50170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.50174: stdout chunk (state=3): >>><<< 46400 1727204611.50176: stderr chunk (state=3): >>><<< 46400 1727204611.50180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204611.468805-53299-80979770611294=/root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.50182: variable 'ansible_module_compression' from source: unknown 46400 1727204611.50184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204611.50186: variable 'ansible_facts' from source: unknown 46400 1727204611.50303: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/AnsiballZ_command.py 46400 1727204611.50371: Sending initial data 46400 1727204611.50374: Sent initial data (154 bytes) 46400 1727204611.51344: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.51362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.51382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.51401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.51445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.51458: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204611.51483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.51501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204611.51514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204611.51525: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204611.51538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.51552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.51573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.51586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.51598: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204611.51613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.51694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.51718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 
1727204611.51735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.51806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.53544: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204611.53583: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204611.53627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp80m0c0xd /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/AnsiballZ_command.py <<< 46400 1727204611.53665: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204611.54801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.55003: stderr chunk (state=3): >>><<< 46400 1727204611.55006: stdout chunk (state=3): >>><<< 46400 1727204611.55008: done transferring module to remote 46400 1727204611.55010: _low_level_execute_command(): starting 46400 1727204611.55013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/ /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/AnsiballZ_command.py && sleep 0' 46400 1727204611.55633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.55649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.55672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.55690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.55731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.55743: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204611.55755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.55782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204611.55793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204611.55803: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204611.55814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.55825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.55839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.55850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.55862: stderr chunk 
(state=3): >>>debug2: match found <<< 46400 1727204611.55883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.55961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.55991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.56007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.56075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.57962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.57967: stdout chunk (state=3): >>><<< 46400 1727204611.57970: stderr chunk (state=3): >>><<< 46400 1727204611.57977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.57980: _low_level_execute_command(): starting 46400 1727204611.57986: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/AnsiballZ_command.py && sleep 0' 46400 1727204611.58782: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.58790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.58800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.58814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.58853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.58859: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204611.58879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.58892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204611.58899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204611.58906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204611.58914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.58923: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 46400 1727204611.58936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.58940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.58949: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204611.58956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.59037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.59054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.59072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.59143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.75854: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:31.723193", "end": "2024-09-24 15:03:31.757530", "delta": "0:00:00.034337", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204611.77242: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204611.77247: stdout chunk (state=3): >>><<< 46400 1727204611.77250: stderr chunk (state=3): >>><<< 46400 1727204611.77407: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:31.723193", "end": "2024-09-24 15:03:31.757530", "delta": "0:00:00.034337", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
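The module result above captures the exact shell snippet the "Cleanup profile and device" task ran on managed-node2 and the rc=1 it returned because the statebr profile and device were already absent. A minimal reconstruction of that task is sketched below; the real task file is not included in this log, so the ignore_errors flag (implied by the "...ignoring" marker in the task result further down) is an assumption, while the command string is copied verbatim from the result.

- name: Cleanup profile and device          # task name as shown in this log
  ansible.builtin.shell: |                  # command string copied from the module result above
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true                       # assumed: the failed result below is reported as "...ignoring"
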
46400 1727204611.77417: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204611.77419: _low_level_execute_command(): starting 46400 1727204611.77422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204611.468805-53299-80979770611294/ > /dev/null 2>&1 && sleep 0' 46400 1727204611.78091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204611.78107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.78123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.78142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.78192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.78205: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204611.78219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.78237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204611.78249: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204611.78271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204611.78286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204611.78300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204611.78316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204611.78327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204611.78338: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204611.78352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204611.78432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204611.78456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204611.78481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204611.78556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204611.80457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204611.80465: stdout chunk (state=3): >>><<< 
46400 1727204611.80476: stderr chunk (state=3): >>><<< 46400 1727204611.80772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204611.80775: handler run complete 46400 1727204611.80777: Evaluated conditional (False): False 46400 1727204611.80780: attempt loop complete, returning result 46400 1727204611.80781: _execute() done 46400 1727204611.80783: dumping result to json 46400 1727204611.80785: done dumping result, returning 46400 1727204611.80787: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-00000000200b] 46400 1727204611.80789: sending task result for task 0affcd87-79f5-1303-fda8-00000000200b fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.034337", "end": "2024-09-24 15:03:31.757530", "rc": 1, "start": "2024-09-24 15:03:31.723193" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204611.80940: no more pending results, returning what we have 46400 1727204611.80944: results queue empty 46400 1727204611.80946: checking for any_errors_fatal 46400 1727204611.80947: done checking for any_errors_fatal 46400 1727204611.80948: checking for max_fail_percentage 46400 1727204611.80950: done checking for max_fail_percentage 46400 1727204611.80951: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.80952: done checking to see if all hosts have failed 46400 1727204611.80952: getting the remaining hosts for this loop 46400 1727204611.80954: done getting the remaining hosts for this loop 46400 1727204611.80958: getting the next task for host managed-node2 46400 1727204611.80972: done getting next task for host managed-node2 46400 1727204611.80977: ^ task is: TASK: Include the task 'run_test.yml' 46400 1727204611.80979: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204611.80984: getting variables 46400 1727204611.80985: in VariableManager get_vars() 46400 1727204611.81034: Calling all_inventory to load vars for managed-node2 46400 1727204611.81037: Calling groups_inventory to load vars for managed-node2 46400 1727204611.81041: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.81053: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.81056: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.81059: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.81682: done sending task result for task 0affcd87-79f5-1303-fda8-00000000200b 46400 1727204611.81686: WORKER PROCESS EXITING 46400 1727204611.82882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.85151: done with get_vars() 46400 1727204611.85193: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.430) 0:01:42.137 ***** 46400 1727204611.85311: entering _queue_task() for managed-node2/include_tasks 46400 1727204611.85678: worker is 1 (out of 1 available) 46400 1727204611.85690: exiting _queue_task() for managed-node2/include_tasks 46400 1727204611.85703: done queuing things up, now waiting for results queue to drain 46400 1727204611.85704: waiting for pending results... 46400 1727204611.85998: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 46400 1727204611.86096: in run() - task 0affcd87-79f5-1303-fda8-000000000017 46400 1727204611.86116: variable 'ansible_search_path' from source: unknown 46400 1727204611.86169: calling self._execute() 46400 1727204611.86277: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.86289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.86303: variable 'omit' from source: magic vars 46400 1727204611.86701: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.86720: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.86731: _execute() done 46400 1727204611.86739: dumping result to json 46400 1727204611.86746: done dumping result, returning 46400 1727204611.86755: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1303-fda8-000000000017] 46400 1727204611.86769: sending task result for task 0affcd87-79f5-1303-fda8-000000000017 46400 1727204611.86928: no more pending results, returning what we have 46400 1727204611.86934: in VariableManager get_vars() 46400 1727204611.86992: Calling all_inventory to load vars for managed-node2 46400 1727204611.86996: Calling groups_inventory to load vars for managed-node2 46400 1727204611.87000: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.87016: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.87019: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.87023: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.88085: done sending task result for task 0affcd87-79f5-1303-fda8-000000000017 46400 1727204611.88089: WORKER PROCESS EXITING 46400 1727204611.89056: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.90777: done with get_vars() 46400 1727204611.90799: variable 'ansible_search_path' from source: unknown 46400 1727204611.90814: we have included files to process 46400 1727204611.90816: generating all_blocks data 46400 1727204611.90817: done generating all_blocks data 46400 1727204611.90822: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204611.90823: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204611.90825: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 46400 1727204611.91331: in VariableManager get_vars() 46400 1727204611.91352: done with get_vars() 46400 1727204611.91399: in VariableManager get_vars() 46400 1727204611.91420: done with get_vars() 46400 1727204611.91463: in VariableManager get_vars() 46400 1727204611.91483: done with get_vars() 46400 1727204611.91525: in VariableManager get_vars() 46400 1727204611.91542: done with get_vars() 46400 1727204611.91587: in VariableManager get_vars() 46400 1727204611.91605: done with get_vars() 46400 1727204611.92006: in VariableManager get_vars() 46400 1727204611.92023: done with get_vars() 46400 1727204611.92035: done processing included file 46400 1727204611.92037: iterating over new_blocks loaded from include file 46400 1727204611.92039: in VariableManager get_vars() 46400 1727204611.92051: done with get_vars() 46400 1727204611.92053: filtering new block on tags 46400 1727204611.92158: done filtering new block on tags 46400 1727204611.92163: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 46400 1727204611.92170: extending task lists for all hosts with included blocks 46400 1727204611.92207: done extending task lists 46400 1727204611.92208: done processing included files 46400 1727204611.92209: results queue empty 46400 1727204611.92210: checking for any_errors_fatal 46400 1727204611.92215: done checking for any_errors_fatal 46400 1727204611.92215: checking for max_fail_percentage 46400 1727204611.92216: done checking for max_fail_percentage 46400 1727204611.92217: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.92218: done checking to see if all hosts have failed 46400 1727204611.92219: getting the remaining hosts for this loop 46400 1727204611.92220: done getting the remaining hosts for this loop 46400 1727204611.92222: getting the next task for host managed-node2 46400 1727204611.92226: done getting next task for host managed-node2 46400 1727204611.92228: ^ task is: TASK: TEST: {{ lsr_description }} 46400 1727204611.92230: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 46400 1727204611.92233: getting variables 46400 1727204611.92234: in VariableManager get_vars() 46400 1727204611.92244: Calling all_inventory to load vars for managed-node2 46400 1727204611.92246: Calling groups_inventory to load vars for managed-node2 46400 1727204611.92248: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.92254: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.92256: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.92261: Calling groups_plugins_play to load vars for managed-node2 46400 1727204611.93624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204611.95508: done with get_vars() 46400 1727204611.95530: done getting variables 46400 1727204611.95583: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204611.95707: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.104) 0:01:42.241 ***** 46400 1727204611.95737: entering _queue_task() for managed-node2/debug 46400 1727204611.96095: worker is 1 (out of 1 available) 46400 1727204611.96108: exiting _queue_task() for managed-node2/debug 46400 1727204611.96121: done queuing things up, now waiting for results queue to drain 46400 1727204611.96123: waiting for pending results... 
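The next task, defined at run_test.yml:5, is a debug banner whose name is templated from lsr_description. Its exact definition is not shown in this log, so the sketch below is a hypothetical reconstruction consistent with the MSG output printed in the result that follows (a row of '#', the description, another row of '#').

- name: "TEST: {{ lsr_description }}"       # task name as templated in this run
  ansible.builtin.debug:
    msg: |                                  # assumed formatting, matching the MSG output below
      ##########
      {{ lsr_description }}
      ##########
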
46400 1727204611.96424: running TaskExecutor() for managed-node2/TASK: TEST: I will not get an error when I try to remove an absent profile 46400 1727204611.96537: in run() - task 0affcd87-79f5-1303-fda8-0000000020ad 46400 1727204611.96564: variable 'ansible_search_path' from source: unknown 46400 1727204611.96577: variable 'ansible_search_path' from source: unknown 46400 1727204611.96618: calling self._execute() 46400 1727204611.96727: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.96740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.96757: variable 'omit' from source: magic vars 46400 1727204611.97143: variable 'ansible_distribution_major_version' from source: facts 46400 1727204611.97166: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204611.97178: variable 'omit' from source: magic vars 46400 1727204611.97250: variable 'omit' from source: magic vars 46400 1727204611.97367: variable 'lsr_description' from source: include params 46400 1727204611.97394: variable 'omit' from source: magic vars 46400 1727204611.97492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204611.97592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204611.97689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204611.97711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.97782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204611.97816: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204611.97826: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.97833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.97985: Set connection var ansible_shell_type to sh 46400 1727204611.98113: Set connection var ansible_shell_executable to /bin/sh 46400 1727204611.98124: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204611.98135: Set connection var ansible_connection to ssh 46400 1727204611.98145: Set connection var ansible_pipelining to False 46400 1727204611.98157: Set connection var ansible_timeout to 10 46400 1727204611.98234: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.98244: variable 'ansible_connection' from source: unknown 46400 1727204611.98270: variable 'ansible_module_compression' from source: unknown 46400 1727204611.98279: variable 'ansible_shell_type' from source: unknown 46400 1727204611.98293: variable 'ansible_shell_executable' from source: unknown 46400 1727204611.98296: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204611.98299: variable 'ansible_pipelining' from source: unknown 46400 1727204611.98301: variable 'ansible_timeout' from source: unknown 46400 1727204611.98306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204611.98467: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204611.98478: variable 'omit' from source: magic vars 46400 1727204611.98483: starting attempt loop 46400 1727204611.98509: running the handler 46400 1727204611.98550: handler run complete 46400 1727204611.98566: attempt loop complete, returning result 46400 1727204611.98569: _execute() done 46400 1727204611.98572: dumping result to json 46400 1727204611.98574: done dumping result, returning 46400 1727204611.98581: done running TaskExecutor() for managed-node2/TASK: TEST: I will not get an error when I try to remove an absent profile [0affcd87-79f5-1303-fda8-0000000020ad] 46400 1727204611.98587: sending task result for task 0affcd87-79f5-1303-fda8-0000000020ad 46400 1727204611.98691: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020ad 46400 1727204611.98694: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 46400 1727204611.98739: no more pending results, returning what we have 46400 1727204611.98744: results queue empty 46400 1727204611.98745: checking for any_errors_fatal 46400 1727204611.98746: done checking for any_errors_fatal 46400 1727204611.98747: checking for max_fail_percentage 46400 1727204611.98748: done checking for max_fail_percentage 46400 1727204611.98749: checking to see if all hosts have failed and the running result is not ok 46400 1727204611.98750: done checking to see if all hosts have failed 46400 1727204611.98751: getting the remaining hosts for this loop 46400 1727204611.98752: done getting the remaining hosts for this loop 46400 1727204611.98756: getting the next task for host managed-node2 46400 1727204611.98770: done getting next task for host managed-node2 46400 1727204611.98773: ^ task is: TASK: Show item 46400 1727204611.98776: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204611.98780: getting variables 46400 1727204611.98782: in VariableManager get_vars() 46400 1727204611.98827: Calling all_inventory to load vars for managed-node2 46400 1727204611.98830: Calling groups_inventory to load vars for managed-node2 46400 1727204611.98834: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204611.98845: Calling all_plugins_play to load vars for managed-node2 46400 1727204611.98847: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204611.98849: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.00287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.02958: done with get_vars() 46400 1727204612.02997: done getting variables 46400 1727204612.03063: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.073) 0:01:42.315 ***** 46400 1727204612.03099: entering _queue_task() for managed-node2/debug 46400 1727204612.03456: worker is 1 (out of 1 available) 46400 1727204612.03475: exiting _queue_task() for managed-node2/debug 46400 1727204612.03488: done queuing things up, now waiting for results queue to drain 46400 1727204612.03490: waiting for pending results... 46400 1727204612.03785: running TaskExecutor() for managed-node2/TASK: Show item 46400 1727204612.03902: in run() - task 0affcd87-79f5-1303-fda8-0000000020ae 46400 1727204612.03922: variable 'ansible_search_path' from source: unknown 46400 1727204612.03936: variable 'ansible_search_path' from source: unknown 46400 1727204612.04001: variable 'omit' from source: magic vars 46400 1727204612.04163: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.04197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.04215: variable 'omit' from source: magic vars 46400 1727204612.04717: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.04740: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.04753: variable 'omit' from source: magic vars 46400 1727204612.04800: variable 'omit' from source: magic vars 46400 1727204612.04854: variable 'item' from source: unknown 46400 1727204612.04929: variable 'item' from source: unknown 46400 1727204612.04956: variable 'omit' from source: magic vars 46400 1727204612.05010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204612.05054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.05087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204612.05112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.05131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 46400 1727204612.05175: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.05186: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.05195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.05307: Set connection var ansible_shell_type to sh 46400 1727204612.05322: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.05331: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.05339: Set connection var ansible_connection to ssh 46400 1727204612.05347: Set connection var ansible_pipelining to False 46400 1727204612.05355: Set connection var ansible_timeout to 10 46400 1727204612.05389: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.05397: variable 'ansible_connection' from source: unknown 46400 1727204612.05403: variable 'ansible_module_compression' from source: unknown 46400 1727204612.05409: variable 'ansible_shell_type' from source: unknown 46400 1727204612.05416: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.05494: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.05502: variable 'ansible_pipelining' from source: unknown 46400 1727204612.05509: variable 'ansible_timeout' from source: unknown 46400 1727204612.05517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.05776: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.05792: variable 'omit' from source: magic vars 46400 1727204612.05820: starting attempt loop 46400 1727204612.05826: running the handler 46400 1727204612.05879: variable 'lsr_description' from source: include params 46400 1727204612.06105: variable 'lsr_description' from source: include params 46400 1727204612.06119: handler run complete 46400 1727204612.06255: attempt loop complete, returning result 46400 1727204612.06281: variable 'item' from source: unknown 46400 1727204612.06347: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 46400 1727204612.06652: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.06674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.06689: variable 'omit' from source: magic vars 46400 1727204612.06865: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.06876: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.06884: variable 'omit' from source: magic vars 46400 1727204612.06902: variable 'omit' from source: magic vars 46400 1727204612.06944: variable 'item' from source: unknown 46400 1727204612.07019: variable 'item' from source: unknown 46400 1727204612.07039: variable 'omit' from source: magic vars 46400 1727204612.07068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.07084: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.07095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.07112: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.07120: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.07127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.07209: Set connection var ansible_shell_type to sh 46400 1727204612.07222: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.07234: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.07245: Set connection var ansible_connection to ssh 46400 1727204612.07254: Set connection var ansible_pipelining to False 46400 1727204612.07269: Set connection var ansible_timeout to 10 46400 1727204612.07300: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.07309: variable 'ansible_connection' from source: unknown 46400 1727204612.07317: variable 'ansible_module_compression' from source: unknown 46400 1727204612.07324: variable 'ansible_shell_type' from source: unknown 46400 1727204612.07330: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.07336: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.07342: variable 'ansible_pipelining' from source: unknown 46400 1727204612.07347: variable 'ansible_timeout' from source: unknown 46400 1727204612.07354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.07449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.07469: variable 'omit' from source: magic vars 46400 1727204612.07478: starting attempt loop 46400 1727204612.07484: running the handler 46400 1727204612.07513: variable 'lsr_setup' from source: include params 46400 1727204612.07589: variable 'lsr_setup' from source: include params 46400 1727204612.07650: handler run complete 46400 1727204612.07675: attempt loop complete, returning result 46400 1727204612.07695: variable 'item' from source: unknown 46400 1727204612.07773: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 46400 1727204612.07936: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.07948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.07962: variable 'omit' from source: magic vars 46400 1727204612.08114: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.08122: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.08129: variable 'omit' from source: magic vars 46400 1727204612.08145: variable 'omit' from source: magic vars 46400 1727204612.08191: variable 'item' from source: unknown 46400 1727204612.08258: variable 'item' from source: unknown 46400 1727204612.08282: variable 'omit' from source: magic vars 
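The "Show item" task at run_test.yml:9 loops over the names of the lsr_* parameters and prints each one; every per-item result in this log carries "ansible_loop_var": "item" plus a key named after the item, which is what a debug of var: "{{ item }}" produces. A sketch follows, with the loop list inferred from the items that appear in the results (the task file itself is not part of this log).

- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"                       # yields the item-named key seen in each result
  loop:                                     # inferred from the loop results in this log
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
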
46400 1727204612.08306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.08323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.08335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.08350: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.08358: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.08371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.08441: Set connection var ansible_shell_type to sh 46400 1727204612.08453: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.08463: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.08476: Set connection var ansible_connection to ssh 46400 1727204612.08485: Set connection var ansible_pipelining to False 46400 1727204612.08495: Set connection var ansible_timeout to 10 46400 1727204612.08520: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.08530: variable 'ansible_connection' from source: unknown 46400 1727204612.08540: variable 'ansible_module_compression' from source: unknown 46400 1727204612.08547: variable 'ansible_shell_type' from source: unknown 46400 1727204612.08554: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.08566: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.08575: variable 'ansible_pipelining' from source: unknown 46400 1727204612.08582: variable 'ansible_timeout' from source: unknown 46400 1727204612.08588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.08707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.08720: variable 'omit' from source: magic vars 46400 1727204612.08729: starting attempt loop 46400 1727204612.08735: running the handler 46400 1727204612.08769: variable 'lsr_test' from source: include params 46400 1727204612.08883: variable 'lsr_test' from source: include params 46400 1727204612.08906: handler run complete 46400 1727204612.08924: attempt loop complete, returning result 46400 1727204612.08944: variable 'item' from source: unknown 46400 1727204612.09031: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 46400 1727204612.09225: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.09254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.09273: variable 'omit' from source: magic vars 46400 1727204612.09465: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.09479: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.09487: variable 'omit' from source: magic vars 46400 1727204612.09505: variable 'omit' from source: magic vars 46400 1727204612.09549: variable 
'item' from source: unknown 46400 1727204612.09621: variable 'item' from source: unknown 46400 1727204612.09642: variable 'omit' from source: magic vars 46400 1727204612.09672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.09689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.09700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.09716: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.09724: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.09730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.09817: Set connection var ansible_shell_type to sh 46400 1727204612.09837: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.09847: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.09857: Set connection var ansible_connection to ssh 46400 1727204612.09872: Set connection var ansible_pipelining to False 46400 1727204612.09882: Set connection var ansible_timeout to 10 46400 1727204612.09910: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.09918: variable 'ansible_connection' from source: unknown 46400 1727204612.09925: variable 'ansible_module_compression' from source: unknown 46400 1727204612.09931: variable 'ansible_shell_type' from source: unknown 46400 1727204612.09938: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.09945: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.09952: variable 'ansible_pipelining' from source: unknown 46400 1727204612.09962: variable 'ansible_timeout' from source: unknown 46400 1727204612.09973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.10209: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.10227: variable 'omit' from source: magic vars 46400 1727204612.10236: starting attempt loop 46400 1727204612.10242: running the handler 46400 1727204612.10272: variable 'lsr_assert' from source: include params 46400 1727204612.10344: variable 'lsr_assert' from source: include params 46400 1727204612.10376: handler run complete 46400 1727204612.10396: attempt loop complete, returning result 46400 1727204612.10415: variable 'item' from source: unknown 46400 1727204612.10488: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 46400 1727204612.12170: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.12189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.12203: variable 'omit' from source: magic vars 46400 1727204612.12448: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.12470: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 46400 1727204612.12478: variable 'omit' from source: magic vars 46400 1727204612.12497: variable 'omit' from source: magic vars 46400 1727204612.12543: variable 'item' from source: unknown 46400 1727204612.13133: variable 'item' from source: unknown 46400 1727204612.13152: variable 'omit' from source: magic vars 46400 1727204612.13182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.13199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.13209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.13227: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.13236: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.13244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.13323: Set connection var ansible_shell_type to sh 46400 1727204612.13338: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.13352: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.13366: Set connection var ansible_connection to ssh 46400 1727204612.13376: Set connection var ansible_pipelining to False 46400 1727204612.13385: Set connection var ansible_timeout to 10 46400 1727204612.13409: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.13415: variable 'ansible_connection' from source: unknown 46400 1727204612.13427: variable 'ansible_module_compression' from source: unknown 46400 1727204612.13594: variable 'ansible_shell_type' from source: unknown 46400 1727204612.13602: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.13608: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.13620: variable 'ansible_pipelining' from source: unknown 46400 1727204612.13627: variable 'ansible_timeout' from source: unknown 46400 1727204612.13634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.13814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.13829: variable 'omit' from source: magic vars 46400 1727204612.13837: starting attempt loop 46400 1727204612.13842: running the handler 46400 1727204612.13869: variable 'lsr_assert_when' from source: include params 46400 1727204612.14012: variable 'lsr_assert_when' from source: include params 46400 1727204612.14114: variable 'network_provider' from source: set_fact 46400 1727204612.14152: handler run complete 46400 1727204612.14176: attempt loop complete, returning result 46400 1727204612.14197: variable 'item' from source: unknown 46400 1727204612.14268: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 46400 1727204612.14438: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 
1727204612.14452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.14470: variable 'omit' from source: magic vars 46400 1727204612.14644: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.14654: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.14667: variable 'omit' from source: magic vars 46400 1727204612.14684: variable 'omit' from source: magic vars 46400 1727204612.14730: variable 'item' from source: unknown 46400 1727204612.14800: variable 'item' from source: unknown 46400 1727204612.14820: variable 'omit' from source: magic vars 46400 1727204612.14843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.14854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.14868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.14885: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.14892: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.14898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.14980: Set connection var ansible_shell_type to sh 46400 1727204612.14993: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.15001: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.15009: Set connection var ansible_connection to ssh 46400 1727204612.15017: Set connection var ansible_pipelining to False 46400 1727204612.15031: Set connection var ansible_timeout to 10 46400 1727204612.15062: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.15074: variable 'ansible_connection' from source: unknown 46400 1727204612.15081: variable 'ansible_module_compression' from source: unknown 46400 1727204612.15088: variable 'ansible_shell_type' from source: unknown 46400 1727204612.15094: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.15101: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.15107: variable 'ansible_pipelining' from source: unknown 46400 1727204612.15113: variable 'ansible_timeout' from source: unknown 46400 1727204612.15119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.15212: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.15224: variable 'omit' from source: magic vars 46400 1727204612.15232: starting attempt loop 46400 1727204612.15241: running the handler 46400 1727204612.15272: variable 'lsr_fail_debug' from source: play vars 46400 1727204612.15335: variable 'lsr_fail_debug' from source: play vars 46400 1727204612.15366: handler run complete 46400 1727204612.15385: attempt loop complete, returning result 46400 1727204612.15405: variable 'item' from source: unknown 46400 1727204612.15473: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": 
"lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 46400 1727204612.15634: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.15696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.15711: variable 'omit' from source: magic vars 46400 1727204612.15955: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.15962: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.15967: variable 'omit' from source: magic vars 46400 1727204612.15981: variable 'omit' from source: magic vars 46400 1727204612.16051: variable 'item' from source: unknown 46400 1727204612.16139: variable 'item' from source: unknown 46400 1727204612.16142: variable 'omit' from source: magic vars 46400 1727204612.16158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.16169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.16172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.16184: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.16186: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.16189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.16270: Set connection var ansible_shell_type to sh 46400 1727204612.16279: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.16284: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.16289: Set connection var ansible_connection to ssh 46400 1727204612.16294: Set connection var ansible_pipelining to False 46400 1727204612.16299: Set connection var ansible_timeout to 10 46400 1727204612.16320: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.16327: variable 'ansible_connection' from source: unknown 46400 1727204612.16330: variable 'ansible_module_compression' from source: unknown 46400 1727204612.16333: variable 'ansible_shell_type' from source: unknown 46400 1727204612.16335: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.16337: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.16342: variable 'ansible_pipelining' from source: unknown 46400 1727204612.16344: variable 'ansible_timeout' from source: unknown 46400 1727204612.16348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.16450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.16457: variable 'omit' from source: magic vars 46400 1727204612.16462: starting attempt loop 46400 1727204612.16469: running the handler 46400 1727204612.16490: variable 'lsr_cleanup' from source: include params 46400 1727204612.16557: variable 'lsr_cleanup' from source: include params 46400 1727204612.16577: handler run complete 46400 1727204612.16594: attempt loop complete, returning result 46400 1727204612.16610: variable 
'item' from source: unknown 46400 1727204612.16672: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 46400 1727204612.16762: dumping result to json 46400 1727204612.16767: done dumping result, returning 46400 1727204612.16769: done running TaskExecutor() for managed-node2/TASK: Show item [0affcd87-79f5-1303-fda8-0000000020ae] 46400 1727204612.16771: sending task result for task 0affcd87-79f5-1303-fda8-0000000020ae 46400 1727204612.16821: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020ae 46400 1727204612.16824: WORKER PROCESS EXITING 46400 1727204612.16874: no more pending results, returning what we have 46400 1727204612.16878: results queue empty 46400 1727204612.16879: checking for any_errors_fatal 46400 1727204612.16886: done checking for any_errors_fatal 46400 1727204612.16887: checking for max_fail_percentage 46400 1727204612.16889: done checking for max_fail_percentage 46400 1727204612.16890: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.16890: done checking to see if all hosts have failed 46400 1727204612.16891: getting the remaining hosts for this loop 46400 1727204612.16893: done getting the remaining hosts for this loop 46400 1727204612.16897: getting the next task for host managed-node2 46400 1727204612.16904: done getting next task for host managed-node2 46400 1727204612.16907: ^ task is: TASK: Include the task 'show_interfaces.yml' 46400 1727204612.16910: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204612.16913: getting variables 46400 1727204612.16915: in VariableManager get_vars() 46400 1727204612.16958: Calling all_inventory to load vars for managed-node2 46400 1727204612.16963: Calling groups_inventory to load vars for managed-node2 46400 1727204612.16969: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.16980: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.16983: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.16986: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.19527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.21406: done with get_vars() 46400 1727204612.21442: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.184) 0:01:42.500 ***** 46400 1727204612.21545: entering _queue_task() for managed-node2/include_tasks 46400 1727204612.21916: worker is 1 (out of 1 available) 46400 1727204612.21929: exiting _queue_task() for managed-node2/include_tasks 46400 1727204612.21942: done queuing things up, now waiting for results queue to drain 46400 1727204612.21944: waiting for pending results... 46400 1727204612.22241: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 46400 1727204612.22361: in run() - task 0affcd87-79f5-1303-fda8-0000000020af 46400 1727204612.22392: variable 'ansible_search_path' from source: unknown 46400 1727204612.22401: variable 'ansible_search_path' from source: unknown 46400 1727204612.22445: calling self._execute() 46400 1727204612.22548: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.22557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.22571: variable 'omit' from source: magic vars 46400 1727204612.22974: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.22993: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.23007: _execute() done 46400 1727204612.23015: dumping result to json 46400 1727204612.23023: done dumping result, returning 46400 1727204612.23032: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1303-fda8-0000000020af] 46400 1727204612.23048: sending task result for task 0affcd87-79f5-1303-fda8-0000000020af 46400 1727204612.23190: no more pending results, returning what we have 46400 1727204612.23196: in VariableManager get_vars() 46400 1727204612.23256: Calling all_inventory to load vars for managed-node2 46400 1727204612.23260: Calling groups_inventory to load vars for managed-node2 46400 1727204612.23269: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.23285: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.23289: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.23293: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.24331: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020af 46400 1727204612.24334: WORKER PROCESS EXITING 46400 1727204612.25197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 46400 1727204612.26933: done with get_vars() 46400 1727204612.26962: variable 'ansible_search_path' from source: unknown 46400 1727204612.26965: variable 'ansible_search_path' from source: unknown 46400 1727204612.27006: we have included files to process 46400 1727204612.27007: generating all_blocks data 46400 1727204612.27010: done generating all_blocks data 46400 1727204612.27014: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204612.27015: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204612.27017: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 46400 1727204612.27129: in VariableManager get_vars() 46400 1727204612.27152: done with get_vars() 46400 1727204612.27270: done processing included file 46400 1727204612.27272: iterating over new_blocks loaded from include file 46400 1727204612.27273: in VariableManager get_vars() 46400 1727204612.27294: done with get_vars() 46400 1727204612.27296: filtering new block on tags 46400 1727204612.27331: done filtering new block on tags 46400 1727204612.27334: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 46400 1727204612.27339: extending task lists for all hosts with included blocks 46400 1727204612.27819: done extending task lists 46400 1727204612.27821: done processing included files 46400 1727204612.27822: results queue empty 46400 1727204612.27822: checking for any_errors_fatal 46400 1727204612.27830: done checking for any_errors_fatal 46400 1727204612.27830: checking for max_fail_percentage 46400 1727204612.27832: done checking for max_fail_percentage 46400 1727204612.27833: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.27833: done checking to see if all hosts have failed 46400 1727204612.27834: getting the remaining hosts for this loop 46400 1727204612.27835: done getting the remaining hosts for this loop 46400 1727204612.27838: getting the next task for host managed-node2 46400 1727204612.27842: done getting next task for host managed-node2 46400 1727204612.27844: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 46400 1727204612.27847: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204612.27850: getting variables 46400 1727204612.27851: in VariableManager get_vars() 46400 1727204612.27862: Calling all_inventory to load vars for managed-node2 46400 1727204612.27866: Calling groups_inventory to load vars for managed-node2 46400 1727204612.27869: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.27875: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.27877: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.27880: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.34813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.36523: done with get_vars() 46400 1727204612.36556: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.150) 0:01:42.650 ***** 46400 1727204612.36643: entering _queue_task() for managed-node2/include_tasks 46400 1727204612.37011: worker is 1 (out of 1 available) 46400 1727204612.37024: exiting _queue_task() for managed-node2/include_tasks 46400 1727204612.37037: done queuing things up, now waiting for results queue to drain 46400 1727204612.37039: waiting for pending results... 46400 1727204612.37337: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 46400 1727204612.37473: in run() - task 0affcd87-79f5-1303-fda8-0000000020d6 46400 1727204612.37500: variable 'ansible_search_path' from source: unknown 46400 1727204612.37508: variable 'ansible_search_path' from source: unknown 46400 1727204612.37548: calling self._execute() 46400 1727204612.37658: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.37674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.37691: variable 'omit' from source: magic vars 46400 1727204612.38098: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.38115: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.38125: _execute() done 46400 1727204612.38137: dumping result to json 46400 1727204612.38149: done dumping result, returning 46400 1727204612.38161: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1303-fda8-0000000020d6] 46400 1727204612.38174: sending task result for task 0affcd87-79f5-1303-fda8-0000000020d6 46400 1727204612.38306: no more pending results, returning what we have 46400 1727204612.38312: in VariableManager get_vars() 46400 1727204612.38371: Calling all_inventory to load vars for managed-node2 46400 1727204612.38376: Calling groups_inventory to load vars for managed-node2 46400 1727204612.38381: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.38395: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.38399: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.38402: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.39483: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020d6 46400 1727204612.39487: WORKER PROCESS EXITING 46400 1727204612.40143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 46400 1727204612.41920: done with get_vars() 46400 1727204612.41943: variable 'ansible_search_path' from source: unknown 46400 1727204612.41944: variable 'ansible_search_path' from source: unknown 46400 1727204612.41986: we have included files to process 46400 1727204612.41988: generating all_blocks data 46400 1727204612.41991: done generating all_blocks data 46400 1727204612.41992: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204612.41993: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204612.41995: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 46400 1727204612.42277: done processing included file 46400 1727204612.42280: iterating over new_blocks loaded from include file 46400 1727204612.42281: in VariableManager get_vars() 46400 1727204612.42302: done with get_vars() 46400 1727204612.42304: filtering new block on tags 46400 1727204612.42345: done filtering new block on tags 46400 1727204612.42348: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 46400 1727204612.42353: extending task lists for all hosts with included blocks 46400 1727204612.42537: done extending task lists 46400 1727204612.42538: done processing included files 46400 1727204612.42539: results queue empty 46400 1727204612.42540: checking for any_errors_fatal 46400 1727204612.42544: done checking for any_errors_fatal 46400 1727204612.42545: checking for max_fail_percentage 46400 1727204612.42546: done checking for max_fail_percentage 46400 1727204612.42547: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.42547: done checking to see if all hosts have failed 46400 1727204612.42548: getting the remaining hosts for this loop 46400 1727204612.42550: done getting the remaining hosts for this loop 46400 1727204612.42552: getting the next task for host managed-node2 46400 1727204612.42559: done getting next task for host managed-node2 46400 1727204612.42562: ^ task is: TASK: Gather current interface info 46400 1727204612.42568: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204612.42570: getting variables 46400 1727204612.42571: in VariableManager get_vars() 46400 1727204612.42583: Calling all_inventory to load vars for managed-node2 46400 1727204612.42586: Calling groups_inventory to load vars for managed-node2 46400 1727204612.42588: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.42593: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.42596: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.42599: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.43818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.44908: done with get_vars() 46400 1727204612.44928: done getting variables 46400 1727204612.44965: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.083) 0:01:42.734 ***** 46400 1727204612.44991: entering _queue_task() for managed-node2/command 46400 1727204612.45247: worker is 1 (out of 1 available) 46400 1727204612.45262: exiting _queue_task() for managed-node2/command 46400 1727204612.45276: done queuing things up, now waiting for results queue to drain 46400 1727204612.45278: waiting for pending results... 
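[Editor's note, sketch only] The records above trace a debug task named "Show item" looping over the variable names lsr_test, lsr_assert, lsr_assert_when, lsr_fail_debug and lsr_cleanup, followed by include_tasks calls for show_interfaces.yml and get_current_interfaces.yml. The following Ansible YAML is a reconstruction from the task names, loop items and include paths that appear in the log; the real tasks in run_test.yml and show_interfaces.yml are not reproduced in this log and may differ, and earlier loop items may have been printed before this excerpt.

# Hypothetical sketch (not the actual test source):
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup

- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

# show_interfaces.yml in turn includes get_current_interfaces.yml
# (task path .../tasks/show_interfaces.yml:3 in the log):
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml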
46400 1727204612.45471: running TaskExecutor() for managed-node2/TASK: Gather current interface info 46400 1727204612.45568: in run() - task 0affcd87-79f5-1303-fda8-000000002111 46400 1727204612.45581: variable 'ansible_search_path' from source: unknown 46400 1727204612.45586: variable 'ansible_search_path' from source: unknown 46400 1727204612.45614: calling self._execute() 46400 1727204612.45697: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.45702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.45714: variable 'omit' from source: magic vars 46400 1727204612.46302: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.46305: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.46308: variable 'omit' from source: magic vars 46400 1727204612.46310: variable 'omit' from source: magic vars 46400 1727204612.46312: variable 'omit' from source: magic vars 46400 1727204612.46314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204612.46317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.46319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204612.46322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.46324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.46326: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.46328: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.46330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.46397: Set connection var ansible_shell_type to sh 46400 1727204612.46401: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.46406: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.46411: Set connection var ansible_connection to ssh 46400 1727204612.46417: Set connection var ansible_pipelining to False 46400 1727204612.46422: Set connection var ansible_timeout to 10 46400 1727204612.46447: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.46450: variable 'ansible_connection' from source: unknown 46400 1727204612.46453: variable 'ansible_module_compression' from source: unknown 46400 1727204612.46456: variable 'ansible_shell_type' from source: unknown 46400 1727204612.46458: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.46463: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.46468: variable 'ansible_pipelining' from source: unknown 46400 1727204612.46470: variable 'ansible_timeout' from source: unknown 46400 1727204612.46473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.46612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204612.46621: variable 'omit' from source: magic vars 46400 
1727204612.46631: starting attempt loop 46400 1727204612.46636: running the handler 46400 1727204612.46654: _low_level_execute_command(): starting 46400 1727204612.46669: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204612.47355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204612.47378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.47393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.47411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.47452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.47466: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204612.47482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.47502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204612.47535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204612.47541: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.47546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204612.47550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.47609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204612.47615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.47629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.47691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.49330: stdout chunk (state=3): >>>/root <<< 46400 1727204612.49448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204612.49509: stderr chunk (state=3): >>><<< 46400 1727204612.49542: stdout chunk (state=3): >>><<< 46400 1727204612.49660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204612.49665: _low_level_execute_command(): starting 46400 1727204612.49668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012 `" && echo ansible-tmp-1727204612.4956708-53345-37632266602012="` echo /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012 `" ) && sleep 0' 46400 1727204612.50259: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204612.50276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.50299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.50319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.50362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.50381: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204612.50394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.50410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204612.50421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204612.50436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204612.50450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.50463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.50481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.50493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.50504: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204612.50517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.50598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204612.50621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.50640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.50719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.52589: stdout chunk (state=3): >>>ansible-tmp-1727204612.4956708-53345-37632266602012=/root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012 <<< 46400 1727204612.52703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204612.52770: stderr chunk (state=3): >>><<< 46400 1727204612.52773: stdout chunk (state=3): >>><<< 46400 1727204612.52789: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.4956708-53345-37632266602012=/root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204612.52818: variable 'ansible_module_compression' from source: unknown 46400 1727204612.52867: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204612.52921: variable 'ansible_facts' from source: unknown 46400 1727204612.52993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/AnsiballZ_command.py 46400 1727204612.53448: Sending initial data 46400 1727204612.53452: Sent initial data (155 bytes) 46400 1727204612.54102: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.54106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.54130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.54140: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204612.54146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.54178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204612.54181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204612.54183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.54257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204612.54262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.54267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.54299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.56016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204612.56048: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204612.56084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpm1423uss /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/AnsiballZ_command.py <<< 46400 1727204612.56120: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204612.56910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204612.57085: stderr chunk (state=3): >>><<< 46400 1727204612.57088: stdout chunk (state=3): >>><<< 46400 1727204612.57112: done transferring module to remote 46400 1727204612.57123: _low_level_execute_command(): starting 46400 1727204612.57128: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/ /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/AnsiballZ_command.py && sleep 0' 46400 1727204612.57913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.57917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.57986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.57995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.58007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.58013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.58060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204612.58085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.58088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.58135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.59839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204612.59895: stderr chunk (state=3): >>><<< 46400 1727204612.59898: stdout chunk (state=3): >>><<< 46400 1727204612.59913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204612.59917: _low_level_execute_command(): starting 46400 1727204612.59922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/AnsiballZ_command.py && sleep 0' 46400 1727204612.60388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.60392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.60425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.60429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.60431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.60482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.60489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.60553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.74044: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:32.736165", "end": "2024-09-24 15:03:32.739385", "delta": "0:00:00.003220", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204612.75238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204612.75242: stderr chunk (state=3): >>><<< 46400 1727204612.75245: stdout chunk (state=3): >>><<< 46400 1727204612.75271: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:32.736165", "end": "2024-09-24 15:03:32.739385", "delta": "0:00:00.003220", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
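[Editor's note, sketch only] The module invocation completed above ran the command module with chdir=/sys/class/net and "ls -1", returning bonding_masters, eth0 and lo. A plausible sketch of the underlying task in get_current_interfaces.yml follows; the register name and the changed_when are assumptions, inferred from the '_current_interfaces' variable and the "Evaluated conditional (False): False" / "changed": false records that appear later in the log.

# Hypothetical sketch, inferred from the logged module args
# ({"chdir": "/sys/class/net", "_raw_params": "ls -1"}):
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # assumed register name
  changed_when: false             # inferred from the conditional evaluated below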
46400 1727204612.75313: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204612.75319: _low_level_execute_command(): starting 46400 1727204612.75324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.4956708-53345-37632266602012/ > /dev/null 2>&1 && sleep 0' 46400 1727204612.76476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204612.76486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.76494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.76509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.76547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.76553: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204612.76566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.76578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204612.76586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204612.76593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204612.76602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204612.76610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204612.76621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204612.76628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204612.76634: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204612.76643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204612.76719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204612.76726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204612.76733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204612.77441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204612.79327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204612.79331: stdout chunk (state=3): >>><<< 46400 1727204612.79338: stderr chunk (state=3): >>><<< 46400 1727204612.79355: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204612.79376: handler run complete 46400 1727204612.79389: Evaluated conditional (False): False 46400 1727204612.79401: attempt loop complete, returning result 46400 1727204612.79404: _execute() done 46400 1727204612.79406: dumping result to json 46400 1727204612.79412: done dumping result, returning 46400 1727204612.79420: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-1303-fda8-000000002111] 46400 1727204612.79427: sending task result for task 0affcd87-79f5-1303-fda8-000000002111 46400 1727204612.79535: done sending task result for task 0affcd87-79f5-1303-fda8-000000002111 46400 1727204612.79538: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003220", "end": "2024-09-24 15:03:32.739385", "rc": 0, "start": "2024-09-24 15:03:32.736165" } STDOUT: bonding_masters eth0 lo 46400 1727204612.79630: no more pending results, returning what we have 46400 1727204612.79634: results queue empty 46400 1727204612.79635: checking for any_errors_fatal 46400 1727204612.79638: done checking for any_errors_fatal 46400 1727204612.79639: checking for max_fail_percentage 46400 1727204612.79641: done checking for max_fail_percentage 46400 1727204612.79642: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.79643: done checking to see if all hosts have failed 46400 1727204612.79643: getting the remaining hosts for this loop 46400 1727204612.79645: done getting the remaining hosts for this loop 46400 1727204612.79649: getting the next task for host managed-node2 46400 1727204612.79660: done getting next task for host managed-node2 46400 1727204612.79663: ^ task is: TASK: Set current_interfaces 46400 1727204612.79673: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204612.79677: getting variables 46400 1727204612.79678: in VariableManager get_vars() 46400 1727204612.79717: Calling all_inventory to load vars for managed-node2 46400 1727204612.79719: Calling groups_inventory to load vars for managed-node2 46400 1727204612.79723: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.79733: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.79736: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.79738: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.82012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.87428: done with get_vars() 46400 1727204612.87466: done getting variables 46400 1727204612.87527: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.425) 0:01:43.160 ***** 46400 1727204612.87569: entering _queue_task() for managed-node2/set_fact 46400 1727204612.87921: worker is 1 (out of 1 available) 46400 1727204612.87933: exiting _queue_task() for managed-node2/set_fact 46400 1727204612.87946: done queuing things up, now waiting for results queue to drain 46400 1727204612.87947: waiting for pending results... 
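Editor's note: the task file referenced above (tasks/get_current_interfaces.yml) is not reproduced in this log. The sketch below is a plausible reconstruction of the two tasks traced here, inferred from the module arguments ('ls -1' with chdir=/sys/class/net), the registered variable name '_current_interfaces', and the fact set in the next result; the use of stdout_lines and the changed_when setting are assumptions (the latter suggested by the command task reporting "changed": false), not confirmed verbatim by the log.

```yaml
# Sketch of tasks/get_current_interfaces.yml (reconstructed from the log, not verbatim)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # variable name seen in the log
  changed_when: false             # assumption: keeps the read-only 'ls' from reporting a change

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # here: ['bonding_masters', 'eth0', 'lo']
```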
46400 1727204612.88243: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 46400 1727204612.88410: in run() - task 0affcd87-79f5-1303-fda8-000000002112 46400 1727204612.88432: variable 'ansible_search_path' from source: unknown 46400 1727204612.88442: variable 'ansible_search_path' from source: unknown 46400 1727204612.88488: calling self._execute() 46400 1727204612.88589: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.88600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.88616: variable 'omit' from source: magic vars 46400 1727204612.89142: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.89170: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.89183: variable 'omit' from source: magic vars 46400 1727204612.89239: variable 'omit' from source: magic vars 46400 1727204612.89368: variable '_current_interfaces' from source: set_fact 46400 1727204612.89441: variable 'omit' from source: magic vars 46400 1727204612.89517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204612.89721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.89751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204612.89779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.89795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.89833: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.89877: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.89887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.90110: Set connection var ansible_shell_type to sh 46400 1727204612.90128: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.90180: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.90191: Set connection var ansible_connection to ssh 46400 1727204612.90202: Set connection var ansible_pipelining to False 46400 1727204612.90212: Set connection var ansible_timeout to 10 46400 1727204612.90274: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.90358: variable 'ansible_connection' from source: unknown 46400 1727204612.90373: variable 'ansible_module_compression' from source: unknown 46400 1727204612.90381: variable 'ansible_shell_type' from source: unknown 46400 1727204612.90389: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.90396: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.90405: variable 'ansible_pipelining' from source: unknown 46400 1727204612.90412: variable 'ansible_timeout' from source: unknown 46400 1727204612.90420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.90780: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204612.90801: variable 'omit' from source: magic vars 46400 1727204612.90811: starting attempt loop 46400 1727204612.90818: running the handler 46400 1727204612.90834: handler run complete 46400 1727204612.90850: attempt loop complete, returning result 46400 1727204612.90913: _execute() done 46400 1727204612.90921: dumping result to json 46400 1727204612.90929: done dumping result, returning 46400 1727204612.90940: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-1303-fda8-000000002112] 46400 1727204612.90951: sending task result for task 0affcd87-79f5-1303-fda8-000000002112 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 46400 1727204612.91181: no more pending results, returning what we have 46400 1727204612.91186: results queue empty 46400 1727204612.91187: checking for any_errors_fatal 46400 1727204612.91200: done checking for any_errors_fatal 46400 1727204612.91201: checking for max_fail_percentage 46400 1727204612.91203: done checking for max_fail_percentage 46400 1727204612.91204: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.91205: done checking to see if all hosts have failed 46400 1727204612.91206: getting the remaining hosts for this loop 46400 1727204612.91208: done getting the remaining hosts for this loop 46400 1727204612.91213: getting the next task for host managed-node2 46400 1727204612.91225: done getting next task for host managed-node2 46400 1727204612.91228: ^ task is: TASK: Show current_interfaces 46400 1727204612.91233: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204612.91239: getting variables 46400 1727204612.91240: in VariableManager get_vars() 46400 1727204612.91294: Calling all_inventory to load vars for managed-node2 46400 1727204612.91298: Calling groups_inventory to load vars for managed-node2 46400 1727204612.91302: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.91315: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.91318: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.91321: Calling groups_plugins_play to load vars for managed-node2 46400 1727204612.92397: done sending task result for task 0affcd87-79f5-1303-fda8-000000002112 46400 1727204612.92400: WORKER PROCESS EXITING 46400 1727204612.93322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204612.95796: done with get_vars() 46400 1727204612.95819: done getting variables 46400 1727204612.95887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.083) 0:01:43.243 ***** 46400 1727204612.95924: entering _queue_task() for managed-node2/debug 46400 1727204612.96273: worker is 1 (out of 1 available) 46400 1727204612.96287: exiting _queue_task() for managed-node2/debug 46400 1727204612.96301: done queuing things up, now waiting for results queue to drain 46400 1727204612.96302: waiting for pending results... 
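Editor's note: the debug task at show_interfaces.yml:5 is most likely the minimal sketch below; only the task name and the rendered message are visible in the log, so the exact msg template is an inference.

```yaml
# Sketch of the task at tasks/show_interfaces.yml:5 (inferred from the rendered MSG in the result)
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```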
46400 1727204612.97053: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 46400 1727204612.97523: in run() - task 0affcd87-79f5-1303-fda8-0000000020d7 46400 1727204612.97546: variable 'ansible_search_path' from source: unknown 46400 1727204612.97556: variable 'ansible_search_path' from source: unknown 46400 1727204612.97723: calling self._execute() 46400 1727204612.98072: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.98107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.98142: variable 'omit' from source: magic vars 46400 1727204612.98571: variable 'ansible_distribution_major_version' from source: facts 46400 1727204612.98586: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204612.98595: variable 'omit' from source: magic vars 46400 1727204612.98652: variable 'omit' from source: magic vars 46400 1727204612.98756: variable 'current_interfaces' from source: set_fact 46400 1727204612.98792: variable 'omit' from source: magic vars 46400 1727204612.98840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204612.98886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204612.98913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204612.98939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.98955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204612.98992: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204612.99001: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.99008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.99110: Set connection var ansible_shell_type to sh 46400 1727204612.99125: Set connection var ansible_shell_executable to /bin/sh 46400 1727204612.99135: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204612.99148: Set connection var ansible_connection to ssh 46400 1727204612.99157: Set connection var ansible_pipelining to False 46400 1727204612.99172: Set connection var ansible_timeout to 10 46400 1727204612.99201: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.99208: variable 'ansible_connection' from source: unknown 46400 1727204612.99214: variable 'ansible_module_compression' from source: unknown 46400 1727204612.99220: variable 'ansible_shell_type' from source: unknown 46400 1727204612.99226: variable 'ansible_shell_executable' from source: unknown 46400 1727204612.99232: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204612.99239: variable 'ansible_pipelining' from source: unknown 46400 1727204612.99245: variable 'ansible_timeout' from source: unknown 46400 1727204612.99256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204612.99412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
46400 1727204612.99429: variable 'omit' from source: magic vars 46400 1727204612.99438: starting attempt loop 46400 1727204612.99444: running the handler 46400 1727204612.99501: handler run complete 46400 1727204612.99520: attempt loop complete, returning result 46400 1727204612.99526: _execute() done 46400 1727204612.99532: dumping result to json 46400 1727204612.99538: done dumping result, returning 46400 1727204612.99549: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-1303-fda8-0000000020d7] 46400 1727204612.99562: sending task result for task 0affcd87-79f5-1303-fda8-0000000020d7 46400 1727204612.99683: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020d7 ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 46400 1727204612.99732: no more pending results, returning what we have 46400 1727204612.99737: results queue empty 46400 1727204612.99738: checking for any_errors_fatal 46400 1727204612.99747: done checking for any_errors_fatal 46400 1727204612.99748: checking for max_fail_percentage 46400 1727204612.99750: done checking for max_fail_percentage 46400 1727204612.99751: checking to see if all hosts have failed and the running result is not ok 46400 1727204612.99752: done checking to see if all hosts have failed 46400 1727204612.99753: getting the remaining hosts for this loop 46400 1727204612.99754: done getting the remaining hosts for this loop 46400 1727204612.99762: getting the next task for host managed-node2 46400 1727204612.99774: done getting next task for host managed-node2 46400 1727204612.99778: ^ task is: TASK: Setup 46400 1727204612.99781: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204612.99796: getting variables 46400 1727204612.99798: in VariableManager get_vars() 46400 1727204612.99903: Calling all_inventory to load vars for managed-node2 46400 1727204612.99906: Calling groups_inventory to load vars for managed-node2 46400 1727204612.99910: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204612.99923: Calling all_plugins_play to load vars for managed-node2 46400 1727204612.99933: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204612.99938: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.01786: WORKER PROCESS EXITING 46400 1727204613.02833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.04825: done with get_vars() 46400 1727204613.04868: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.090) 0:01:43.334 ***** 46400 1727204613.04978: entering _queue_task() for managed-node2/include_tasks 46400 1727204613.05410: worker is 1 (out of 1 available) 46400 1727204613.05428: exiting _queue_task() for managed-node2/include_tasks 46400 1727204613.05454: done queuing things up, now waiting for results queue to drain 46400 1727204613.05456: waiting for pending results... 46400 1727204613.05767: running TaskExecutor() for managed-node2/TASK: Setup 46400 1727204613.05893: in run() - task 0affcd87-79f5-1303-fda8-0000000020b0 46400 1727204613.06026: variable 'ansible_search_path' from source: unknown 46400 1727204613.06033: variable 'ansible_search_path' from source: unknown 46400 1727204613.06087: variable 'lsr_setup' from source: include params 46400 1727204613.06557: variable 'lsr_setup' from source: include params 46400 1727204613.06640: variable 'omit' from source: magic vars 46400 1727204613.07051: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.07073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.07110: variable 'omit' from source: magic vars 46400 1727204613.07669: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.07685: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.07696: variable 'item' from source: unknown 46400 1727204613.07784: variable 'item' from source: unknown 46400 1727204613.07825: variable 'item' from source: unknown 46400 1727204613.07898: variable 'item' from source: unknown 46400 1727204613.08104: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.08117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.08131: variable 'omit' from source: magic vars 46400 1727204613.08307: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.08318: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.08327: variable 'item' from source: unknown 46400 1727204613.08398: variable 'item' from source: unknown 46400 1727204613.08435: variable 'item' from source: unknown 46400 1727204613.08507: variable 'item' from source: unknown 46400 1727204613.08641: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.08653: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 46400 1727204613.08670: variable 'omit' from source: magic vars 46400 1727204613.08833: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.08845: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.08853: variable 'item' from source: unknown 46400 1727204613.08924: variable 'item' from source: unknown 46400 1727204613.08957: variable 'item' from source: unknown 46400 1727204613.09028: variable 'item' from source: unknown 46400 1727204613.09111: dumping result to json 46400 1727204613.09120: done dumping result, returning 46400 1727204613.09133: done running TaskExecutor() for managed-node2/TASK: Setup [0affcd87-79f5-1303-fda8-0000000020b0] 46400 1727204613.09144: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b0 46400 1727204613.09215: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b0 46400 1727204613.09224: WORKER PROCESS EXITING 46400 1727204613.09269: no more pending results, returning what we have 46400 1727204613.09276: in VariableManager get_vars() 46400 1727204613.09332: Calling all_inventory to load vars for managed-node2 46400 1727204613.09335: Calling groups_inventory to load vars for managed-node2 46400 1727204613.09340: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.09355: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.09358: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.09366: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.11416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.13049: done with get_vars() 46400 1727204613.13079: variable 'ansible_search_path' from source: unknown 46400 1727204613.13081: variable 'ansible_search_path' from source: unknown 46400 1727204613.13124: variable 'ansible_search_path' from source: unknown 46400 1727204613.13125: variable 'ansible_search_path' from source: unknown 46400 1727204613.13154: variable 'ansible_search_path' from source: unknown 46400 1727204613.13155: variable 'ansible_search_path' from source: unknown 46400 1727204613.13189: we have included files to process 46400 1727204613.13190: generating all_blocks data 46400 1727204613.13193: done generating all_blocks data 46400 1727204613.13197: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204613.13198: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204613.13201: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 46400 1727204613.13450: done processing included file 46400 1727204613.13452: iterating over new_blocks loaded from include file 46400 1727204613.13454: in VariableManager get_vars() 46400 1727204613.13476: done with get_vars() 46400 1727204613.13478: filtering new block on tags 46400 1727204613.13516: done filtering new block on tags 46400 1727204613.13519: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 46400 1727204613.13524: processing included 
file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204613.13525: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204613.13528: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 46400 1727204613.13625: done processing included file 46400 1727204613.13626: iterating over new_blocks loaded from include file 46400 1727204613.13628: in VariableManager get_vars() 46400 1727204613.13644: done with get_vars() 46400 1727204613.13646: filtering new block on tags 46400 1727204613.13672: done filtering new block on tags 46400 1727204613.13674: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 46400 1727204613.13678: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204613.13679: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204613.13682: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204613.13776: done processing included file 46400 1727204613.13778: iterating over new_blocks loaded from include file 46400 1727204613.13779: in VariableManager get_vars() 46400 1727204613.13795: done with get_vars() 46400 1727204613.13796: filtering new block on tags 46400 1727204613.13817: done filtering new block on tags 46400 1727204613.13819: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 46400 1727204613.13822: extending task lists for all hosts with included blocks 46400 1727204613.14518: done extending task lists 46400 1727204613.14519: done processing included files 46400 1727204613.14520: results queue empty 46400 1727204613.14521: checking for any_errors_fatal 46400 1727204613.14525: done checking for any_errors_fatal 46400 1727204613.14525: checking for max_fail_percentage 46400 1727204613.14527: done checking for max_fail_percentage 46400 1727204613.14527: checking to see if all hosts have failed and the running result is not ok 46400 1727204613.14528: done checking to see if all hosts have failed 46400 1727204613.14529: getting the remaining hosts for this loop 46400 1727204613.14530: done getting the remaining hosts for this loop 46400 1727204613.14533: getting the next task for host managed-node2 46400 1727204613.14537: done getting next task for host managed-node2 46400 1727204613.14539: ^ task is: TASK: Include network role 46400 1727204613.14541: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204613.14544: getting variables 46400 1727204613.14545: in VariableManager get_vars() 46400 1727204613.14556: Calling all_inventory to load vars for managed-node2 46400 1727204613.14560: Calling groups_inventory to load vars for managed-node2 46400 1727204613.14563: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.14571: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.14573: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.14576: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.15751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.17442: done with get_vars() 46400 1727204613.17480: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.125) 0:01:43.460 ***** 46400 1727204613.17576: entering _queue_task() for managed-node2/include_role 46400 1727204613.17949: worker is 1 (out of 1 available) 46400 1727204613.17965: exiting _queue_task() for managed-node2/include_role 46400 1727204613.17978: done queuing things up, now waiting for results queue to drain 46400 1727204613.17980: waiting for pending results... 
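Editor's note: create_bridge_profile.yml:3 wraps the fedora.linux_system_roles.network role; the log confirms only the include_role call. The network_connections payload below is a hypothetical illustration of how a bridge profile is typically passed to that role, and the profile name and IP settings are assumptions, not values taken from this run.

```yaml
# Hypothetical sketch of tasks/create_bridge_profile.yml; only the role name is confirmed by the log
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr        # assumed profile name
        state: up
        type: bridge
        ip:
          dhcp4: false       # assumed addressing settings for illustration
          auto6: false
```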
46400 1727204613.18281: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204613.18427: in run() - task 0affcd87-79f5-1303-fda8-000000002139 46400 1727204613.18452: variable 'ansible_search_path' from source: unknown 46400 1727204613.18463: variable 'ansible_search_path' from source: unknown 46400 1727204613.18508: calling self._execute() 46400 1727204613.18626: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.18644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.18663: variable 'omit' from source: magic vars 46400 1727204613.19068: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.19092: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.19105: _execute() done 46400 1727204613.19112: dumping result to json 46400 1727204613.19119: done dumping result, returning 46400 1727204613.19128: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000002139] 46400 1727204613.19139: sending task result for task 0affcd87-79f5-1303-fda8-000000002139 46400 1727204613.19320: no more pending results, returning what we have 46400 1727204613.19326: in VariableManager get_vars() 46400 1727204613.19388: Calling all_inventory to load vars for managed-node2 46400 1727204613.19392: Calling groups_inventory to load vars for managed-node2 46400 1727204613.19396: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.19411: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.19414: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.19417: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.20480: done sending task result for task 0affcd87-79f5-1303-fda8-000000002139 46400 1727204613.20483: WORKER PROCESS EXITING 46400 1727204613.21335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.23052: done with get_vars() 46400 1727204613.23085: variable 'ansible_search_path' from source: unknown 46400 1727204613.23087: variable 'ansible_search_path' from source: unknown 46400 1727204613.23302: variable 'omit' from source: magic vars 46400 1727204613.23343: variable 'omit' from source: magic vars 46400 1727204613.23362: variable 'omit' from source: magic vars 46400 1727204613.23369: we have included files to process 46400 1727204613.23370: generating all_blocks data 46400 1727204613.23372: done generating all_blocks data 46400 1727204613.23373: processing included file: fedora.linux_system_roles.network 46400 1727204613.23396: in VariableManager get_vars() 46400 1727204613.23414: done with get_vars() 46400 1727204613.23444: in VariableManager get_vars() 46400 1727204613.23469: done with get_vars() 46400 1727204613.23509: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204613.23641: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204613.23728: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204613.24222: in VariableManager get_vars() 46400 1727204613.24245: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204613.26294: iterating over new_blocks loaded from 
include file 46400 1727204613.26297: in VariableManager get_vars() 46400 1727204613.26318: done with get_vars() 46400 1727204613.26320: filtering new block on tags 46400 1727204613.26622: done filtering new block on tags 46400 1727204613.26625: in VariableManager get_vars() 46400 1727204613.26644: done with get_vars() 46400 1727204613.26646: filtering new block on tags 46400 1727204613.26667: done filtering new block on tags 46400 1727204613.26669: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204613.26675: extending task lists for all hosts with included blocks 46400 1727204613.26844: done extending task lists 46400 1727204613.26845: done processing included files 46400 1727204613.26846: results queue empty 46400 1727204613.26847: checking for any_errors_fatal 46400 1727204613.26851: done checking for any_errors_fatal 46400 1727204613.26852: checking for max_fail_percentage 46400 1727204613.26853: done checking for max_fail_percentage 46400 1727204613.26854: checking to see if all hosts have failed and the running result is not ok 46400 1727204613.26855: done checking to see if all hosts have failed 46400 1727204613.26855: getting the remaining hosts for this loop 46400 1727204613.26857: done getting the remaining hosts for this loop 46400 1727204613.26862: getting the next task for host managed-node2 46400 1727204613.26870: done getting next task for host managed-node2 46400 1727204613.26873: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204613.26876: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204613.26887: getting variables 46400 1727204613.26888: in VariableManager get_vars() 46400 1727204613.26903: Calling all_inventory to load vars for managed-node2 46400 1727204613.26906: Calling groups_inventory to load vars for managed-node2 46400 1727204613.26907: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.26913: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.26915: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.26918: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.28182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.29116: done with get_vars() 46400 1727204613.29139: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.116) 0:01:43.576 ***** 46400 1727204613.29206: entering _queue_task() for managed-node2/include_tasks 46400 1727204613.29472: worker is 1 (out of 1 available) 46400 1727204613.29487: exiting _queue_task() for managed-node2/include_tasks 46400 1727204613.29500: done queuing things up, now waiting for results queue to drain 46400 1727204613.29502: waiting for pending results... 46400 1727204613.29799: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204613.29971: in run() - task 0affcd87-79f5-1303-fda8-0000000021a3 46400 1727204613.29975: variable 'ansible_search_path' from source: unknown 46400 1727204613.29978: variable 'ansible_search_path' from source: unknown 46400 1727204613.29981: calling self._execute() 46400 1727204613.30069: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.30081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.30088: variable 'omit' from source: magic vars 46400 1727204613.30478: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.30490: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.30502: _execute() done 46400 1727204613.30506: dumping result to json 46400 1727204613.30510: done dumping result, returning 46400 1727204613.30515: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-0000000021a3] 46400 1727204613.30522: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a3 46400 1727204613.30684: no more pending results, returning what we have 46400 1727204613.30690: in VariableManager get_vars() 46400 1727204613.30753: Calling all_inventory to load vars for managed-node2 46400 1727204613.30757: Calling groups_inventory to load vars for managed-node2 46400 1727204613.30759: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.30775: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.30780: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.30785: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.31307: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a3 46400 1727204613.31311: WORKER PROCESS EXITING 46400 1727204613.31900: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.32840: done with get_vars() 46400 1727204613.32857: variable 'ansible_search_path' from source: unknown 46400 1727204613.32858: variable 'ansible_search_path' from source: unknown 46400 1727204613.32889: we have included files to process 46400 1727204613.32890: generating all_blocks data 46400 1727204613.32892: done generating all_blocks data 46400 1727204613.32894: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204613.32894: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204613.32896: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204613.33501: done processing included file 46400 1727204613.33503: iterating over new_blocks loaded from include file 46400 1727204613.33505: in VariableManager get_vars() 46400 1727204613.33539: done with get_vars() 46400 1727204613.33541: filtering new block on tags 46400 1727204613.33576: done filtering new block on tags 46400 1727204613.33580: in VariableManager get_vars() 46400 1727204613.33611: done with get_vars() 46400 1727204613.33613: filtering new block on tags 46400 1727204613.33660: done filtering new block on tags 46400 1727204613.33665: in VariableManager get_vars() 46400 1727204613.33705: done with get_vars() 46400 1727204613.33708: filtering new block on tags 46400 1727204613.33754: done filtering new block on tags 46400 1727204613.33756: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204613.33761: extending task lists for all hosts with included blocks 46400 1727204613.35992: done extending task lists 46400 1727204613.35994: done processing included files 46400 1727204613.35995: results queue empty 46400 1727204613.35996: checking for any_errors_fatal 46400 1727204613.35999: done checking for any_errors_fatal 46400 1727204613.36000: checking for max_fail_percentage 46400 1727204613.36002: done checking for max_fail_percentage 46400 1727204613.36002: checking to see if all hosts have failed and the running result is not ok 46400 1727204613.36003: done checking to see if all hosts have failed 46400 1727204613.36004: getting the remaining hosts for this loop 46400 1727204613.36006: done getting the remaining hosts for this loop 46400 1727204613.36008: getting the next task for host managed-node2 46400 1727204613.36014: done getting next task for host managed-node2 46400 1727204613.36018: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204613.36022: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204613.36035: getting variables 46400 1727204613.36036: in VariableManager get_vars() 46400 1727204613.36058: Calling all_inventory to load vars for managed-node2 46400 1727204613.36065: Calling groups_inventory to load vars for managed-node2 46400 1727204613.36068: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.36075: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.36078: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.36081: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.36928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.38195: done with get_vars() 46400 1727204613.38220: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.091) 0:01:43.667 ***** 46400 1727204613.38316: entering _queue_task() for managed-node2/setup 46400 1727204613.38693: worker is 1 (out of 1 available) 46400 1727204613.38707: exiting _queue_task() for managed-node2/setup 46400 1727204613.38721: done queuing things up, now waiting for results queue to drain 46400 1727204613.38722: waiting for pending results... 
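Editor's note: the conditional evaluated below, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, is taken straight from the log; it re-runs fact gathering only when facts the role needs are missing, which is why the task is skipped here. A sketch of the task at set_facts.yml:3 follows, with the gather_subset value as an assumption.

```yaml
# Sketch of the task at roles/network/tasks/set_facts.yml:3; the when expression matches the log,
# the gather_subset value is an assumption
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```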
46400 1727204613.39032: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204613.39220: in run() - task 0affcd87-79f5-1303-fda8-000000002200 46400 1727204613.39247: variable 'ansible_search_path' from source: unknown 46400 1727204613.39262: variable 'ansible_search_path' from source: unknown 46400 1727204613.39314: calling self._execute() 46400 1727204613.39713: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.39728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.39743: variable 'omit' from source: magic vars 46400 1727204613.40110: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.40126: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.40349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204613.42911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204613.42989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204613.43035: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204613.43077: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204613.43110: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204613.43193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204613.43229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204613.43260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204613.43309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204613.43329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204613.43388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204613.43416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204613.43448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204613.43495: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204613.43513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204613.43695: variable '__network_required_facts' from source: role '' defaults 46400 1727204613.43709: variable 'ansible_facts' from source: unknown 46400 1727204613.44846: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204613.44857: when evaluation is False, skipping this task 46400 1727204613.44866: _execute() done 46400 1727204613.44874: dumping result to json 46400 1727204613.44880: done dumping result, returning 46400 1727204613.44890: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-000000002200] 46400 1727204613.44905: sending task result for task 0affcd87-79f5-1303-fda8-000000002200 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204613.45050: no more pending results, returning what we have 46400 1727204613.45056: results queue empty 46400 1727204613.45057: checking for any_errors_fatal 46400 1727204613.45059: done checking for any_errors_fatal 46400 1727204613.45059: checking for max_fail_percentage 46400 1727204613.45061: done checking for max_fail_percentage 46400 1727204613.45062: checking to see if all hosts have failed and the running result is not ok 46400 1727204613.45063: done checking to see if all hosts have failed 46400 1727204613.45066: getting the remaining hosts for this loop 46400 1727204613.45068: done getting the remaining hosts for this loop 46400 1727204613.45073: getting the next task for host managed-node2 46400 1727204613.45086: done getting next task for host managed-node2 46400 1727204613.45091: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204613.45097: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204613.45120: getting variables 46400 1727204613.45122: in VariableManager get_vars() 46400 1727204613.45176: Calling all_inventory to load vars for managed-node2 46400 1727204613.45179: Calling groups_inventory to load vars for managed-node2 46400 1727204613.45182: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.45193: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.45196: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.45198: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.46259: done sending task result for task 0affcd87-79f5-1303-fda8-000000002200 46400 1727204613.46271: WORKER PROCESS EXITING 46400 1727204613.47270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.48982: done with get_vars() 46400 1727204613.49013: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.108) 0:01:43.775 ***** 46400 1727204613.49120: entering _queue_task() for managed-node2/stat 46400 1727204613.49480: worker is 1 (out of 1 available) 46400 1727204613.49494: exiting _queue_task() for managed-node2/stat 46400 1727204613.49508: done queuing things up, now waiting for results queue to drain 46400 1727204613.49509: waiting for pending results... 46400 1727204613.49807: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204613.49986: in run() - task 0affcd87-79f5-1303-fda8-000000002202 46400 1727204613.50007: variable 'ansible_search_path' from source: unknown 46400 1727204613.50014: variable 'ansible_search_path' from source: unknown 46400 1727204613.50057: calling self._execute() 46400 1727204613.50157: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.50173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.50187: variable 'omit' from source: magic vars 46400 1727204613.50558: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.50577: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.50746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204613.51027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204613.51082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204613.51118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204613.51158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204613.51247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204613.51283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204613.51314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204613.51343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204613.51440: variable '__network_is_ostree' from source: set_fact 46400 1727204613.51451: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204613.51458: when evaluation is False, skipping this task 46400 1727204613.51466: _execute() done 46400 1727204613.51475: dumping result to json 46400 1727204613.51481: done dumping result, returning 46400 1727204613.51491: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000002202] 46400 1727204613.51501: sending task result for task 0affcd87-79f5-1303-fda8-000000002202 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204613.51656: no more pending results, returning what we have 46400 1727204613.51661: results queue empty 46400 1727204613.51662: checking for any_errors_fatal 46400 1727204613.51674: done checking for any_errors_fatal 46400 1727204613.51675: checking for max_fail_percentage 46400 1727204613.51677: done checking for max_fail_percentage 46400 1727204613.51678: checking to see if all hosts have failed and the running result is not ok 46400 1727204613.51679: done checking to see if all hosts have failed 46400 1727204613.51680: getting the remaining hosts for this loop 46400 1727204613.51682: done getting the remaining hosts for this loop 46400 1727204613.51686: getting the next task for host managed-node2 46400 1727204613.51697: done getting next task for host managed-node2 46400 1727204613.51701: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204613.51707: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204613.51729: getting variables 46400 1727204613.51731: in VariableManager get_vars() 46400 1727204613.51784: Calling all_inventory to load vars for managed-node2 46400 1727204613.51787: Calling groups_inventory to load vars for managed-node2 46400 1727204613.51789: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.51800: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.51803: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.51807: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.52784: done sending task result for task 0affcd87-79f5-1303-fda8-000000002202 46400 1727204613.52787: WORKER PROCESS EXITING 46400 1727204613.53557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.55277: done with get_vars() 46400 1727204613.55303: done getting variables 46400 1727204613.55367: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.062) 0:01:43.838 ***** 46400 1727204613.55406: entering _queue_task() for managed-node2/set_fact 46400 1727204613.55775: worker is 1 (out of 1 available) 46400 1727204613.55787: exiting _queue_task() for managed-node2/set_fact 46400 1727204613.55803: done queuing things up, now waiting for results queue to drain 46400 1727204613.55805: waiting for pending results... 
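The "Check if system is ostree" task above is skipped because __network_is_ostree is already defined from an earlier set_fact in this run (false_condition: "not __network_is_ostree is defined"), and the companion "Set flag to indicate system is ostree" task queued below is skipped for the same reason. As a hedged sketch, the guarded pair in set_facts.yml conventionally looks like the following; the task names, the stat and set_fact modules, and the __network_is_ostree guard are confirmed by the log, while the /run/ostree-booted path and the __ostree_booted_stat register name are assumptions based on the usual linux_system_roles pattern, not taken from this run.

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # assumed ostree marker file, not shown in this log
  register: __ostree_booted_stat      # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined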
46400 1727204613.56127: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204613.56316: in run() - task 0affcd87-79f5-1303-fda8-000000002203 46400 1727204613.56341: variable 'ansible_search_path' from source: unknown 46400 1727204613.56349: variable 'ansible_search_path' from source: unknown 46400 1727204613.56396: calling self._execute() 46400 1727204613.56499: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.56510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.56523: variable 'omit' from source: magic vars 46400 1727204613.56882: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.56903: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.57066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204613.57350: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204613.57400: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204613.57444: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204613.57485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204613.57575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204613.57603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204613.57632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204613.57669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204613.57756: variable '__network_is_ostree' from source: set_fact 46400 1727204613.57774: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204613.57782: when evaluation is False, skipping this task 46400 1727204613.57787: _execute() done 46400 1727204613.57793: dumping result to json 46400 1727204613.57799: done dumping result, returning 46400 1727204613.57808: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000002203] 46400 1727204613.57817: sending task result for task 0affcd87-79f5-1303-fda8-000000002203 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204613.57954: no more pending results, returning what we have 46400 1727204613.57959: results queue empty 46400 1727204613.57960: checking for any_errors_fatal 46400 1727204613.57970: done checking for any_errors_fatal 46400 1727204613.57971: checking for max_fail_percentage 46400 1727204613.57973: done checking for max_fail_percentage 46400 1727204613.57974: checking to see 
if all hosts have failed and the running result is not ok 46400 1727204613.57974: done checking to see if all hosts have failed 46400 1727204613.57975: getting the remaining hosts for this loop 46400 1727204613.57977: done getting the remaining hosts for this loop 46400 1727204613.57981: getting the next task for host managed-node2 46400 1727204613.57997: done getting next task for host managed-node2 46400 1727204613.58001: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204613.58006: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204613.58028: getting variables 46400 1727204613.58030: in VariableManager get_vars() 46400 1727204613.58084: Calling all_inventory to load vars for managed-node2 46400 1727204613.58087: Calling groups_inventory to load vars for managed-node2 46400 1727204613.58090: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204613.58101: Calling all_plugins_play to load vars for managed-node2 46400 1727204613.58103: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204613.58106: Calling groups_plugins_play to load vars for managed-node2 46400 1727204613.59084: done sending task result for task 0affcd87-79f5-1303-fda8-000000002203 46400 1727204613.59088: WORKER PROCESS EXITING 46400 1727204613.60150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204613.61806: done with get_vars() 46400 1727204613.61840: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:33 -0400 (0:00:00.065) 0:01:43.904 ***** 46400 1727204613.61944: entering _queue_task() for managed-node2/service_facts 46400 1727204613.62302: worker is 1 (out of 1 available) 46400 1727204613.62315: exiting _queue_task() for managed-node2/service_facts 46400 1727204613.62328: done queuing things up, now waiting for results queue to drain 46400 1727204613.62330: waiting for pending results... 
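The "Check which services are running" task queued below (set_facts.yml:21) is dispatched to the service_facts module; the large JSON payload further down is its return value and is stored under ansible_facts.services. A minimal sketch of that fact gathering plus a hypothetical consumer follows; the consumer task is illustrative only and not part of the role, but the service name and the state field match the output shown below.

- name: Check which services are running
  service_facts:                      # no module arguments, matching "module_args": {} in the log

- name: Record whether NetworkManager is running   # hypothetical example, not in the role
  set_fact:
    _nm_running: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"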
46400 1727204613.62629: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204613.62813: in run() - task 0affcd87-79f5-1303-fda8-000000002205 46400 1727204613.62836: variable 'ansible_search_path' from source: unknown 46400 1727204613.62846: variable 'ansible_search_path' from source: unknown 46400 1727204613.62893: calling self._execute() 46400 1727204613.62995: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.63005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.63019: variable 'omit' from source: magic vars 46400 1727204613.63387: variable 'ansible_distribution_major_version' from source: facts 46400 1727204613.63405: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204613.63416: variable 'omit' from source: magic vars 46400 1727204613.63500: variable 'omit' from source: magic vars 46400 1727204613.63545: variable 'omit' from source: magic vars 46400 1727204613.63595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204613.63638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204613.63670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204613.63691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204613.63706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204613.63740: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204613.63750: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.63760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.63862: Set connection var ansible_shell_type to sh 46400 1727204613.63883: Set connection var ansible_shell_executable to /bin/sh 46400 1727204613.63894: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204613.63903: Set connection var ansible_connection to ssh 46400 1727204613.63912: Set connection var ansible_pipelining to False 46400 1727204613.63921: Set connection var ansible_timeout to 10 46400 1727204613.63950: variable 'ansible_shell_executable' from source: unknown 46400 1727204613.63957: variable 'ansible_connection' from source: unknown 46400 1727204613.63967: variable 'ansible_module_compression' from source: unknown 46400 1727204613.63976: variable 'ansible_shell_type' from source: unknown 46400 1727204613.63985: variable 'ansible_shell_executable' from source: unknown 46400 1727204613.63991: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204613.63998: variable 'ansible_pipelining' from source: unknown 46400 1727204613.64004: variable 'ansible_timeout' from source: unknown 46400 1727204613.64011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204613.64216: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204613.64233: variable 'omit' from source: magic vars 46400 
1727204613.64242: starting attempt loop 46400 1727204613.64248: running the handler 46400 1727204613.64267: _low_level_execute_command(): starting 46400 1727204613.64281: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204613.65048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204613.65068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.65086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.65106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.65150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204613.65162: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204613.65180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.65200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204613.65212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204613.65223: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204613.65235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.65250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.65269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.65282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204613.65296: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204613.65310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.65379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204613.65406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204613.65424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204613.65504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204613.67175: stdout chunk (state=3): >>>/root <<< 46400 1727204613.67283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204613.67367: stderr chunk (state=3): >>><<< 46400 1727204613.67382: stdout chunk (state=3): >>><<< 46400 1727204613.67504: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204613.67508: _low_level_execute_command(): starting 46400 1727204613.67510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249 `" && echo ansible-tmp-1727204613.674095-53394-82334910474249="` echo /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249 `" ) && sleep 0' 46400 1727204613.68092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.68095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.68133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.68138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.68236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204613.68274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204613.70137: stdout chunk (state=3): >>>ansible-tmp-1727204613.674095-53394-82334910474249=/root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249 <<< 46400 1727204613.70247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204613.70335: stderr chunk (state=3): >>><<< 46400 1727204613.70339: stdout chunk (state=3): >>><<< 46400 1727204613.70373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204613.674095-53394-82334910474249=/root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204613.70574: variable 'ansible_module_compression' from source: unknown 46400 1727204613.70578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204613.70580: variable 'ansible_facts' from source: unknown 46400 1727204613.70605: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/AnsiballZ_service_facts.py 46400 1727204613.70770: Sending initial data 46400 1727204613.70773: Sent initial data (160 bytes) 46400 1727204613.72071: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204613.72089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.72110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.72133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.72181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204613.72194: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204613.72209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.72236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204613.72250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204613.72267: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204613.72281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.72296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.72313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.72330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204613.72346: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204613.72367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.72448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204613.72472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204613.72488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204613.72556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204613.74277: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204613.74344: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204613.74392: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmppg1dthnu /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/AnsiballZ_service_facts.py <<< 46400 1727204613.74720: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204613.75449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204613.75569: stderr chunk (state=3): >>><<< 46400 1727204613.75572: stdout chunk (state=3): >>><<< 46400 1727204613.75575: done transferring module to remote 46400 1727204613.75585: _low_level_execute_command(): starting 46400 1727204613.75590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/ /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/AnsiballZ_service_facts.py && sleep 0' 46400 1727204613.76053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.76057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.76094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.76102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.76104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.76156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204613.76160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204613.76162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204613.76202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204613.77922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204613.78008: stderr chunk (state=3): >>><<< 46400 1727204613.78017: stdout chunk (state=3): >>><<< 46400 1727204613.78113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204613.78116: _low_level_execute_command(): starting 46400 1727204613.78118: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/AnsiballZ_service_facts.py && sleep 0' 46400 1727204613.78622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204613.78628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204613.78681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204613.78685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204613.78687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204613.78731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204613.78739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204613.78800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.07991: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204615.08053: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204615.09370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204615.09374: stdout chunk (state=3): >>><<< 46400 1727204615.09387: stderr chunk (state=3): >>><<< 46400 1727204615.09405: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204615.10734: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204615.10831: _low_level_execute_command(): starting 46400 1727204615.10835: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204613.674095-53394-82334910474249/ > /dev/null 2>&1 && sleep 0' 46400 1727204615.12355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204615.12369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.12380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.12395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.12462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.12481: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.12491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.12505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.12512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204615.12519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204615.12527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.12536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.12547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.12556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.12566: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204615.12585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.12647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.12663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.12671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.12758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.14534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204615.14666: stderr chunk (state=3): >>><<< 46400 1727204615.14695: stdout chunk (state=3): >>><<< 46400 1727204615.14802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204615.14808: handler run complete 46400 1727204615.15119: variable 'ansible_facts' from source: unknown 46400 1727204615.15324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204615.15974: variable 'ansible_facts' from source: unknown 46400 1727204615.16161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204615.16447: attempt loop complete, returning result 46400 1727204615.16498: _execute() done 46400 1727204615.16518: dumping result to json 46400 1727204615.16620: done dumping result, returning 46400 1727204615.16644: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000002205] 46400 1727204615.16665: sending task result for task 0affcd87-79f5-1303-fda8-000000002205 46400 1727204615.18068: done sending task result for task 0affcd87-79f5-1303-fda8-000000002205 46400 1727204615.18088: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204615.18290: no more pending results, returning what we have 46400 1727204615.18295: results queue empty 46400 1727204615.18296: checking for any_errors_fatal 46400 1727204615.18304: done checking for any_errors_fatal 46400 1727204615.18305: checking for max_fail_percentage 46400 1727204615.18307: done checking for max_fail_percentage 46400 1727204615.18308: checking to see if all hosts have failed and the running result is not ok 46400 1727204615.18309: done checking to see if all hosts have failed 46400 1727204615.18310: getting the remaining hosts for this loop 46400 1727204615.18312: done getting the remaining hosts for this loop 46400 1727204615.18318: getting the next task for host managed-node2 46400 1727204615.18329: done getting next task for host managed-node2 46400 1727204615.18335: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204615.18342: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204615.18367: getting variables 46400 1727204615.18371: in VariableManager get_vars() 46400 1727204615.18429: Calling all_inventory to load vars for managed-node2 46400 1727204615.18433: Calling groups_inventory to load vars for managed-node2 46400 1727204615.18436: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204615.18452: Calling all_plugins_play to load vars for managed-node2 46400 1727204615.18457: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204615.18461: Calling groups_plugins_play to load vars for managed-node2 46400 1727204615.21003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204615.23168: done with get_vars() 46400 1727204615.23216: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:35 -0400 (0:00:01.614) 0:01:45.518 ***** 46400 1727204615.23352: entering _queue_task() for managed-node2/package_facts 46400 1727204615.23823: worker is 1 (out of 1 available) 46400 1727204615.23837: exiting _queue_task() for managed-node2/package_facts 46400 1727204615.23850: done queuing things up, now waiting for results queue to drain 46400 1727204615.23852: waiting for pending results... 
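(Editor's note, not part of the captured log.) The previous task's result was printed as "censored" because 'no_log: true' was set on it, and the strategy now queues the package_facts action defined at roles/network/tasks/set_facts.yml:26 ("Check which packages are installed"). That file is not reproduced in this log, so the snippet below is only a hedged approximation of what such a task typically looks like; the 'glibc' key used in the follow-up example is taken from the package_facts output that appears further down in the trace, and the debug task is purely illustrative:

    # Hedged approximation of the kind of task being queued here; the actual
    # contents of roles/network/tasks/set_facts.yml:26 are not shown in the log.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
      # no_log is illustrated because the service_facts result above was
      # censored for that reason; whether this task sets it is not visible here.
      no_log: true

    # Illustrative consumer; ansible_facts.packages maps package name to a
    # list of installed instances, as seen in the module output below.
    - name: Report glibc versions (illustrative only)
      ansible.builtin.debug:
        msg: "{{ ansible_facts.packages['glibc'] | map(attribute='version') | list }}"
      when: "'glibc' in ansible_facts.packages"
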
46400 1727204615.24188: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204615.24357: in run() - task 0affcd87-79f5-1303-fda8-000000002206 46400 1727204615.24388: variable 'ansible_search_path' from source: unknown 46400 1727204615.24396: variable 'ansible_search_path' from source: unknown 46400 1727204615.24441: calling self._execute() 46400 1727204615.24592: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204615.24604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204615.24618: variable 'omit' from source: magic vars 46400 1727204615.25114: variable 'ansible_distribution_major_version' from source: facts 46400 1727204615.25135: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204615.25146: variable 'omit' from source: magic vars 46400 1727204615.25326: variable 'omit' from source: magic vars 46400 1727204615.25370: variable 'omit' from source: magic vars 46400 1727204615.25438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204615.25493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204615.25523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204615.25552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204615.25574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204615.25611: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204615.25619: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204615.25629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204615.25748: Set connection var ansible_shell_type to sh 46400 1727204615.25773: Set connection var ansible_shell_executable to /bin/sh 46400 1727204615.25791: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204615.25811: Set connection var ansible_connection to ssh 46400 1727204615.25842: Set connection var ansible_pipelining to False 46400 1727204615.25858: Set connection var ansible_timeout to 10 46400 1727204615.25946: variable 'ansible_shell_executable' from source: unknown 46400 1727204615.25963: variable 'ansible_connection' from source: unknown 46400 1727204615.25973: variable 'ansible_module_compression' from source: unknown 46400 1727204615.25986: variable 'ansible_shell_type' from source: unknown 46400 1727204615.25996: variable 'ansible_shell_executable' from source: unknown 46400 1727204615.26012: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204615.26027: variable 'ansible_pipelining' from source: unknown 46400 1727204615.26034: variable 'ansible_timeout' from source: unknown 46400 1727204615.26046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204615.26390: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204615.26406: variable 'omit' from source: magic vars 46400 
1727204615.26421: starting attempt loop 46400 1727204615.26428: running the handler 46400 1727204615.26450: _low_level_execute_command(): starting 46400 1727204615.26461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204615.27641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204615.27667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.27692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.27720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.27786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.27798: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.27811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.27829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.27842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204615.27857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204615.27880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.27906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.27922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.27934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.27945: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204615.27967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.28058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.28112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.28139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.28234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.29855: stdout chunk (state=3): >>>/root <<< 46400 1727204615.30065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204615.30196: stderr chunk (state=3): >>><<< 46400 1727204615.30209: stdout chunk (state=3): >>><<< 46400 1727204615.30346: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204615.30349: _low_level_execute_command(): starting 46400 1727204615.30352: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224 `" && echo ansible-tmp-1727204615.3024096-53568-34604674385224="` echo /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224 `" ) && sleep 0' 46400 1727204615.31005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204615.31021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.31035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.31051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.31102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.31119: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.31134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.31151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.31163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204615.31176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204615.31188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.31200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.31223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.31240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.31250: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204615.31263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.31353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.31378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.31393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.31471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.33317: stdout chunk (state=3): >>>ansible-tmp-1727204615.3024096-53568-34604674385224=/root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224 <<< 46400 1727204615.33423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204615.33520: stderr chunk (state=3): >>><<< 46400 1727204615.33530: stdout chunk (state=3): >>><<< 46400 1727204615.33777: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204615.3024096-53568-34604674385224=/root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204615.33780: variable 'ansible_module_compression' from source: unknown 46400 1727204615.33783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204615.33785: variable 'ansible_facts' from source: unknown 46400 1727204615.33947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/AnsiballZ_package_facts.py 46400 1727204615.34136: Sending initial data 46400 1727204615.34140: Sent initial data (161 bytes) 46400 1727204615.35232: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204615.35247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.35261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.35282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.35337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.35348: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.35361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.35381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.35392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204615.35412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204615.35427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.35440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.35455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.35469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.35480: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204615.35492: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.35582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.35603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.35623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.35700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.37421: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204615.37459: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204615.37499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpjr4uoj35 /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/AnsiballZ_package_facts.py <<< 46400 1727204615.37533: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204615.39736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204615.39935: stderr chunk (state=3): >>><<< 46400 1727204615.39939: stdout chunk (state=3): >>><<< 46400 1727204615.39958: done transferring module to remote 46400 1727204615.39975: _low_level_execute_command(): starting 46400 1727204615.39980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/ /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/AnsiballZ_package_facts.py && sleep 0' 46400 1727204615.40677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204615.40686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.40696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.40715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.40757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.40768: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.40777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.40789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.40796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204615.40802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204615.40809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204615.40821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.40838: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.40844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.40851: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204615.40859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.40937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.40960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.40976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.41045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.42795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204615.42817: stderr chunk (state=3): >>><<< 46400 1727204615.42820: stdout chunk (state=3): >>><<< 46400 1727204615.42834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204615.42836: _low_level_execute_command(): starting 46400 1727204615.42843: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/AnsiballZ_package_facts.py && sleep 0' 46400 1727204615.43297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204615.43301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.43335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204615.43349: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204615.43357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.43374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204615.43380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 46400 1727204615.43391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204615.43396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204615.43446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204615.43472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204615.43485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204615.43532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204615.89980: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": 
[{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204615.90010: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": 
"16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204615.90015: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 46400 1727204615.90026: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": 
"libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": 
"19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204615.90029: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 46400 1727204615.90036: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": 
"curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204615.90098: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204615.90111: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", 
"version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": 
[{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204615.90119: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204615.90122: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", 
"release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204615.90138: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204615.90145: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204615.90149: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204615.91770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204615.91774: stdout chunk (state=3): >>><<< 46400 1727204615.91777: stderr chunk (state=3): >>><<< 46400 1727204615.91862: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": 
"gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", 
"release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": 
"1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, 
"arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": 
[{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": 
"geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204616.01862: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204616.01884: _low_level_execute_command(): starting 46400 1727204616.01888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204615.3024096-53568-34604674385224/ > /dev/null 2>&1 && sleep 0' 46400 1727204616.02693: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204616.02697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204616.02700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204616.02703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204616.02705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204616.02707: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204616.02710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204616.02712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204616.02714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204616.02716: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204616.02718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204616.02729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204616.02735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204616.02744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204616.02751: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204616.02761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204616.02850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204616.02855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204616.02862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204616.02937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204616.04752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204616.04807: stderr 
chunk (state=3): >>><<< 46400 1727204616.04812: stdout chunk (state=3): >>><<< 46400 1727204616.04869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204616.04873: handler run complete 46400 1727204616.05773: variable 'ansible_facts' from source: unknown 46400 1727204616.06276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.10322: variable 'ansible_facts' from source: unknown 46400 1727204616.11512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.14262: attempt loop complete, returning result 46400 1727204616.14357: _execute() done 46400 1727204616.14457: dumping result to json 46400 1727204616.15081: done dumping result, returning 46400 1727204616.15101: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000002206] 46400 1727204616.15112: sending task result for task 0affcd87-79f5-1303-fda8-000000002206 46400 1727204616.37419: done sending task result for task 0affcd87-79f5-1303-fda8-000000002206 46400 1727204616.37429: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204616.37521: no more pending results, returning what we have 46400 1727204616.37523: results queue empty 46400 1727204616.37524: checking for any_errors_fatal 46400 1727204616.37528: done checking for any_errors_fatal 46400 1727204616.37529: checking for max_fail_percentage 46400 1727204616.37530: done checking for max_fail_percentage 46400 1727204616.37531: checking to see if all hosts have failed and the running result is not ok 46400 1727204616.37532: done checking to see if all hosts have failed 46400 1727204616.37533: getting the remaining hosts for this loop 46400 1727204616.37534: done getting the remaining hosts for this loop 46400 1727204616.37540: getting the next task for host managed-node2 46400 1727204616.37547: done getting next task for host managed-node2 46400 1727204616.37550: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204616.37558: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204616.37573: getting variables 46400 1727204616.37575: in VariableManager get_vars() 46400 1727204616.37598: Calling all_inventory to load vars for managed-node2 46400 1727204616.37601: Calling groups_inventory to load vars for managed-node2 46400 1727204616.37607: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204616.37614: Calling all_plugins_play to load vars for managed-node2 46400 1727204616.37617: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204616.37620: Calling groups_plugins_play to load vars for managed-node2 46400 1727204616.40357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.42097: done with get_vars() 46400 1727204616.42129: done getting variables 46400 1727204616.42218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:36 -0400 (0:00:01.189) 0:01:46.707 ***** 46400 1727204616.42254: entering _queue_task() for managed-node2/debug 46400 1727204616.42970: worker is 1 (out of 1 available) 46400 1727204616.42982: exiting _queue_task() for managed-node2/debug 46400 1727204616.42995: done queuing things up, now waiting for results queue to drain 46400 1727204616.42996: waiting for pending results... 
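
For orientation, the package_facts invocation traced above (module_args manager: ["auto"], strategy: "first", executed with no_log so the result is censored in the play output) corresponds to a task roughly like the following; this is a sketch reconstructed from the invocation shown in the log, not the role's actual source:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto   # the log shows manager ["auto"] and strategy "first" (the module default)
      no_log: true      # matches the censored result reported for this task

The fact it sets, ansible_facts.packages, is the dictionary dumped above: each key is a package name mapping to a list of {name, version, release, epoch, arch, source} entries, so later tasks can use tests such as "'NetworkManager' in ansible_facts.packages".
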
46400 1727204616.43992: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204616.44307: in run() - task 0affcd87-79f5-1303-fda8-0000000021a4 46400 1727204616.44331: variable 'ansible_search_path' from source: unknown 46400 1727204616.44350: variable 'ansible_search_path' from source: unknown 46400 1727204616.44401: calling self._execute() 46400 1727204616.44522: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.44535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.44551: variable 'omit' from source: magic vars 46400 1727204616.45044: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.45062: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204616.45077: variable 'omit' from source: magic vars 46400 1727204616.45160: variable 'omit' from source: magic vars 46400 1727204616.45278: variable 'network_provider' from source: set_fact 46400 1727204616.45303: variable 'omit' from source: magic vars 46400 1727204616.45360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204616.45414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204616.45445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204616.45481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204616.45505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204616.45543: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204616.45552: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.45565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.45689: Set connection var ansible_shell_type to sh 46400 1727204616.45705: Set connection var ansible_shell_executable to /bin/sh 46400 1727204616.45716: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204616.45726: Set connection var ansible_connection to ssh 46400 1727204616.45767: Set connection var ansible_pipelining to False 46400 1727204616.45795: Set connection var ansible_timeout to 10 46400 1727204616.45908: variable 'ansible_shell_executable' from source: unknown 46400 1727204616.45917: variable 'ansible_connection' from source: unknown 46400 1727204616.45924: variable 'ansible_module_compression' from source: unknown 46400 1727204616.45931: variable 'ansible_shell_type' from source: unknown 46400 1727204616.45938: variable 'ansible_shell_executable' from source: unknown 46400 1727204616.45946: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.45953: variable 'ansible_pipelining' from source: unknown 46400 1727204616.45960: variable 'ansible_timeout' from source: unknown 46400 1727204616.45970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.46135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204616.46151: variable 'omit' from source: magic vars 46400 1727204616.46161: starting attempt loop 46400 1727204616.46170: running the handler 46400 1727204616.46223: handler run complete 46400 1727204616.46242: attempt loop complete, returning result 46400 1727204616.46248: _execute() done 46400 1727204616.46254: dumping result to json 46400 1727204616.46260: done dumping result, returning 46400 1727204616.46274: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-0000000021a4] 46400 1727204616.46283: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a4 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204616.46458: no more pending results, returning what we have 46400 1727204616.46462: results queue empty 46400 1727204616.46465: checking for any_errors_fatal 46400 1727204616.46478: done checking for any_errors_fatal 46400 1727204616.46479: checking for max_fail_percentage 46400 1727204616.46481: done checking for max_fail_percentage 46400 1727204616.46482: checking to see if all hosts have failed and the running result is not ok 46400 1727204616.46483: done checking to see if all hosts have failed 46400 1727204616.46484: getting the remaining hosts for this loop 46400 1727204616.46486: done getting the remaining hosts for this loop 46400 1727204616.46490: getting the next task for host managed-node2 46400 1727204616.46500: done getting next task for host managed-node2 46400 1727204616.46505: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204616.46512: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204616.46528: getting variables 46400 1727204616.46530: in VariableManager get_vars() 46400 1727204616.46581: Calling all_inventory to load vars for managed-node2 46400 1727204616.46584: Calling groups_inventory to load vars for managed-node2 46400 1727204616.46587: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204616.46598: Calling all_plugins_play to load vars for managed-node2 46400 1727204616.46601: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204616.46604: Calling groups_plugins_play to load vars for managed-node2 46400 1727204616.48674: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a4 46400 1727204616.48678: WORKER PROCESS EXITING 46400 1727204616.50288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.54017: done with get_vars() 46400 1727204616.54057: done getting variables 46400 1727204616.54247: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.123) 0:01:46.831 ***** 46400 1727204616.54654: entering _queue_task() for managed-node2/fail 46400 1727204616.55359: worker is 1 (out of 1 available) 46400 1727204616.55488: exiting _queue_task() for managed-node2/fail 46400 1727204616.55500: done queuing things up, now waiting for results queue to drain 46400 1727204616.55502: waiting for pending results... 
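
The ok result above for "Print network provider" ("Using network provider: nm", built from the network_provider variable set earlier via set_fact) comes from a plain debug task; a minimal sketch, assuming the message template rather than quoting the role source:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
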
46400 1727204616.56328: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204616.56771: in run() - task 0affcd87-79f5-1303-fda8-0000000021a5 46400 1727204616.56922: variable 'ansible_search_path' from source: unknown 46400 1727204616.56931: variable 'ansible_search_path' from source: unknown 46400 1727204616.56976: calling self._execute() 46400 1727204616.57186: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.57199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.57214: variable 'omit' from source: magic vars 46400 1727204616.58124: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.58144: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204616.58392: variable 'network_state' from source: role '' defaults 46400 1727204616.58410: Evaluated conditional (network_state != {}): False 46400 1727204616.58420: when evaluation is False, skipping this task 46400 1727204616.58438: _execute() done 46400 1727204616.58543: dumping result to json 46400 1727204616.58553: done dumping result, returning 46400 1727204616.58566: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-0000000021a5] 46400 1727204616.58578: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a5 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204616.58734: no more pending results, returning what we have 46400 1727204616.58739: results queue empty 46400 1727204616.58740: checking for any_errors_fatal 46400 1727204616.58750: done checking for any_errors_fatal 46400 1727204616.58751: checking for max_fail_percentage 46400 1727204616.58752: done checking for max_fail_percentage 46400 1727204616.58753: checking to see if all hosts have failed and the running result is not ok 46400 1727204616.58754: done checking to see if all hosts have failed 46400 1727204616.58755: getting the remaining hosts for this loop 46400 1727204616.58757: done getting the remaining hosts for this loop 46400 1727204616.58761: getting the next task for host managed-node2 46400 1727204616.58775: done getting next task for host managed-node2 46400 1727204616.58780: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204616.58788: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204616.58813: getting variables 46400 1727204616.58815: in VariableManager get_vars() 46400 1727204616.58872: Calling all_inventory to load vars for managed-node2 46400 1727204616.58875: Calling groups_inventory to load vars for managed-node2 46400 1727204616.58878: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204616.58891: Calling all_plugins_play to load vars for managed-node2 46400 1727204616.58894: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204616.58897: Calling groups_plugins_play to load vars for managed-node2 46400 1727204616.60373: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a5 46400 1727204616.60377: WORKER PROCESS EXITING 46400 1727204616.62221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.65534: done with get_vars() 46400 1727204616.65781: done getting variables 46400 1727204616.65846: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.112) 0:01:46.943 ***** 46400 1727204616.65887: entering _queue_task() for managed-node2/fail 46400 1727204616.66648: worker is 1 (out of 1 available) 46400 1727204616.66662: exiting _queue_task() for managed-node2/fail 46400 1727204616.66876: done queuing things up, now waiting for results queue to drain 46400 1727204616.66878: waiting for pending results... 
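
The "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" task above is skipped because the guard network_state != {} evaluates to False (network_state is still the role default, an empty dict). A hedged sketch of such a guarded fail task, built only from the condition and skip output visible in the log (the real task at tasks/main.yml:11 may carry additional conditions, e.g. on the selected provider):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration with the initscripts provider is not supported  # illustrative wording, not quoted from the role
      when: network_state != {}  # reported above as the false_condition
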
46400 1727204616.67914: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204616.68487: in run() - task 0affcd87-79f5-1303-fda8-0000000021a6 46400 1727204616.68532: variable 'ansible_search_path' from source: unknown 46400 1727204616.68540: variable 'ansible_search_path' from source: unknown 46400 1727204616.68598: calling self._execute() 46400 1727204616.68830: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.68931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.68945: variable 'omit' from source: magic vars 46400 1727204616.69855: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.69897: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204616.70185: variable 'network_state' from source: role '' defaults 46400 1727204616.70257: Evaluated conditional (network_state != {}): False 46400 1727204616.70346: when evaluation is False, skipping this task 46400 1727204616.70358: _execute() done 46400 1727204616.70371: dumping result to json 46400 1727204616.70380: done dumping result, returning 46400 1727204616.70386: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-0000000021a6] 46400 1727204616.70396: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a6 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204616.70555: no more pending results, returning what we have 46400 1727204616.70560: results queue empty 46400 1727204616.70562: checking for any_errors_fatal 46400 1727204616.70573: done checking for any_errors_fatal 46400 1727204616.70574: checking for max_fail_percentage 46400 1727204616.70576: done checking for max_fail_percentage 46400 1727204616.70577: checking to see if all hosts have failed and the running result is not ok 46400 1727204616.70578: done checking to see if all hosts have failed 46400 1727204616.70578: getting the remaining hosts for this loop 46400 1727204616.70580: done getting the remaining hosts for this loop 46400 1727204616.70584: getting the next task for host managed-node2 46400 1727204616.70594: done getting next task for host managed-node2 46400 1727204616.70598: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204616.70605: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204616.70628: getting variables 46400 1727204616.70630: in VariableManager get_vars() 46400 1727204616.70686: Calling all_inventory to load vars for managed-node2 46400 1727204616.70690: Calling groups_inventory to load vars for managed-node2 46400 1727204616.70693: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204616.70706: Calling all_plugins_play to load vars for managed-node2 46400 1727204616.70709: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204616.70713: Calling groups_plugins_play to load vars for managed-node2 46400 1727204616.71366: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a6 46400 1727204616.71371: WORKER PROCESS EXITING 46400 1727204616.74035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.76171: done with get_vars() 46400 1727204616.76214: done getting variables 46400 1727204616.76279: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.104) 0:01:47.047 ***** 46400 1727204616.76329: entering _queue_task() for managed-node2/fail 46400 1727204616.76820: worker is 1 (out of 1 available) 46400 1727204616.76834: exiting _queue_task() for managed-node2/fail 46400 1727204616.76968: done queuing things up, now waiting for results queue to drain 46400 1727204616.76970: waiting for pending results... 
46400 1727204616.78112: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204616.78500: in run() - task 0affcd87-79f5-1303-fda8-0000000021a7 46400 1727204616.78526: variable 'ansible_search_path' from source: unknown 46400 1727204616.78539: variable 'ansible_search_path' from source: unknown 46400 1727204616.78585: calling self._execute() 46400 1727204616.78740: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.78872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.78888: variable 'omit' from source: magic vars 46400 1727204616.79534: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.79553: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204616.79789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204616.85957: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204616.86100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204616.86149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204616.86197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204616.86243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204616.86337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204616.86689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204616.86724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204616.86783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204616.86802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204616.86931: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.87099: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204616.87106: when evaluation is False, skipping this task 46400 1727204616.87113: _execute() done 46400 1727204616.87119: dumping result to json 46400 1727204616.87126: done dumping result, returning 46400 1727204616.87137: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-0000000021a7] 46400 1727204616.87146: sending task result for task 
0affcd87-79f5-1303-fda8-0000000021a7 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204616.87451: no more pending results, returning what we have 46400 1727204616.87456: results queue empty 46400 1727204616.87457: checking for any_errors_fatal 46400 1727204616.87462: done checking for any_errors_fatal 46400 1727204616.87463: checking for max_fail_percentage 46400 1727204616.87466: done checking for max_fail_percentage 46400 1727204616.87467: checking to see if all hosts have failed and the running result is not ok 46400 1727204616.87468: done checking to see if all hosts have failed 46400 1727204616.87469: getting the remaining hosts for this loop 46400 1727204616.87471: done getting the remaining hosts for this loop 46400 1727204616.87475: getting the next task for host managed-node2 46400 1727204616.87485: done getting next task for host managed-node2 46400 1727204616.87489: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204616.87495: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204616.87514: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a7 46400 1727204616.87518: WORKER PROCESS EXITING 46400 1727204616.87530: getting variables 46400 1727204616.87532: in VariableManager get_vars() 46400 1727204616.87582: Calling all_inventory to load vars for managed-node2 46400 1727204616.87585: Calling groups_inventory to load vars for managed-node2 46400 1727204616.87587: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204616.87598: Calling all_plugins_play to load vars for managed-node2 46400 1727204616.87601: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204616.87604: Calling groups_plugins_play to load vars for managed-node2 46400 1727204616.90567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204616.94583: done with get_vars() 46400 1727204616.94640: done getting variables 46400 1727204616.94831: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.185) 0:01:47.233 ***** 46400 1727204616.94885: entering _queue_task() for managed-node2/dnf 46400 1727204616.95312: worker is 1 (out of 1 available) 46400 1727204616.95326: exiting _queue_task() for managed-node2/dnf 46400 1727204616.95344: done queuing things up, now waiting for results queue to drain 46400 1727204616.95345: waiting for pending results... 
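
The teaming abort above is skipped on this host because ansible_distribution_major_version | int > 9 evaluates to False on EL9. A sketch of that guard, again reconstructed only from the conditions reported in the log (the real task at tasks/main.yml:25 presumably also checks whether any team connections are actually requested):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later  # illustrative wording, not quoted from the role
      when: ansible_distribution_major_version | int > 9  # reported above as the false_condition
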
46400 1727204616.95675: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204616.95845: in run() - task 0affcd87-79f5-1303-fda8-0000000021a8 46400 1727204616.95872: variable 'ansible_search_path' from source: unknown 46400 1727204616.95884: variable 'ansible_search_path' from source: unknown 46400 1727204616.95932: calling self._execute() 46400 1727204616.96049: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204616.96067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204616.96084: variable 'omit' from source: magic vars 46400 1727204616.96527: variable 'ansible_distribution_major_version' from source: facts 46400 1727204616.96551: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204616.96886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.00125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.00211: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.00262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.00308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.00425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.00514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.00571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.00603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.00658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.00684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.00827: variable 'ansible_distribution' from source: facts 46400 1727204617.00955: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.00986: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204617.01213: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.01377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.01411: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.01439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.01508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.01528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.01577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.01609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.01640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.01689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.01714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.01767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.01795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.01832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.01883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.01904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.02097: variable 'network_connections' from source: include params 46400 1727204617.02116: variable 'interface' from source: play vars 46400 1727204617.02200: variable 'interface' from source: play vars 46400 1727204617.02296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204617.02516: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204617.02561: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204617.02628: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204617.02668: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204617.02734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204617.02768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204617.02817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.02849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204617.02931: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204617.03250: variable 'network_connections' from source: include params 46400 1727204617.03268: variable 'interface' from source: play vars 46400 1727204617.03355: variable 'interface' from source: play vars 46400 1727204617.03403: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204617.03411: when evaluation is False, skipping this task 46400 1727204617.03424: _execute() done 46400 1727204617.03436: dumping result to json 46400 1727204617.03446: done dumping result, returning 46400 1727204617.03468: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000021a8] 46400 1727204617.03480: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a8 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204617.03656: no more pending results, returning what we have 46400 1727204617.03663: results queue empty 46400 1727204617.03666: checking for any_errors_fatal 46400 1727204617.03674: done checking for any_errors_fatal 46400 1727204617.03677: checking for max_fail_percentage 46400 1727204617.03679: done checking for max_fail_percentage 46400 1727204617.03680: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.03681: done checking to see if all hosts have failed 46400 1727204617.03682: getting the remaining hosts for this loop 46400 1727204617.03684: done getting the remaining hosts for this loop 46400 1727204617.03689: getting the next task for host managed-node2 46400 1727204617.03699: done getting next task for host managed-node2 46400 1727204617.03704: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204617.03711: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204617.03735: getting variables 46400 1727204617.03736: in VariableManager get_vars() 46400 1727204617.03796: Calling all_inventory to load vars for managed-node2 46400 1727204617.03799: Calling groups_inventory to load vars for managed-node2 46400 1727204617.03802: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.03820: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.03826: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.03833: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.04805: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a8 46400 1727204617.04809: WORKER PROCESS EXITING 46400 1727204617.05954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.08481: done with get_vars() 46400 1727204617.08508: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204617.08705: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.138) 0:01:47.371 ***** 46400 1727204617.08815: entering _queue_task() for managed-node2/yum 46400 1727204617.09211: worker is 1 (out of 1 available) 46400 1727204617.09223: exiting _queue_task() for managed-node2/yum 46400 1727204617.09236: done queuing things up, now waiting for results queue to drain 46400 1727204617.09239: waiting for pending results... 
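
The YUM variant of the same check (main.yml:48) is queued next; on this host it is redirected to dnf, as the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows. A rough sketch under the same assumptions follows. Note that when conditions evaluate in order and stop at the first False, so only the version guard appears in the trace below; the wireless/team condition implied by the task name is presumably also present but never gets evaluated.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    list: updates                  # assumed arguments, mirroring the DNF task above
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
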
46400 1727204617.09556: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204617.09741: in run() - task 0affcd87-79f5-1303-fda8-0000000021a9 46400 1727204617.09766: variable 'ansible_search_path' from source: unknown 46400 1727204617.09777: variable 'ansible_search_path' from source: unknown 46400 1727204617.09821: calling self._execute() 46400 1727204617.09939: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.09952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.09973: variable 'omit' from source: magic vars 46400 1727204617.10401: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.10411: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.10540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.13281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.13368: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.13426: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.13481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.13526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.13621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.13679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.13710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.13766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.13787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.13908: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.13931: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204617.13944: when evaluation is False, skipping this task 46400 1727204617.13950: _execute() done 46400 1727204617.13957: dumping result to json 46400 1727204617.13974: done dumping result, returning 46400 1727204617.13986: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000021a9] 46400 
1727204617.13996: sending task result for task 0affcd87-79f5-1303-fda8-0000000021a9 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204617.14176: no more pending results, returning what we have 46400 1727204617.14182: results queue empty 46400 1727204617.14183: checking for any_errors_fatal 46400 1727204617.14192: done checking for any_errors_fatal 46400 1727204617.14193: checking for max_fail_percentage 46400 1727204617.14195: done checking for max_fail_percentage 46400 1727204617.14196: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.14197: done checking to see if all hosts have failed 46400 1727204617.14198: getting the remaining hosts for this loop 46400 1727204617.14200: done getting the remaining hosts for this loop 46400 1727204617.14205: getting the next task for host managed-node2 46400 1727204617.14215: done getting next task for host managed-node2 46400 1727204617.14220: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204617.14228: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204617.14253: getting variables 46400 1727204617.14256: in VariableManager get_vars() 46400 1727204617.14313: Calling all_inventory to load vars for managed-node2 46400 1727204617.14317: Calling groups_inventory to load vars for managed-node2 46400 1727204617.14319: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.14331: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.14335: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.14338: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.15470: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021a9 46400 1727204617.15474: WORKER PROCESS EXITING 46400 1727204617.17249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.19244: done with get_vars() 46400 1727204617.19296: done getting variables 46400 1727204617.19366: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.106) 0:01:47.478 ***** 46400 1727204617.19410: entering _queue_task() for managed-node2/fail 46400 1727204617.19799: worker is 1 (out of 1 available) 46400 1727204617.19812: exiting _queue_task() for managed-node2/fail 46400 1727204617.19824: done queuing things up, now waiting for results queue to drain 46400 1727204617.19826: waiting for pending results... 
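
Next comes the consent gate at main.yml:60, implemented with the fail action loaded above. This is a sketch only: the task name and the wireless/team condition evaluated below come from the log, while the message wording is a placeholder and not the role's actual text.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                        # placeholder wording, not the role's actual message
      NetworkManager must be restarted to pick up wireless or team interface
      changes; set the role's consent variable to allow this.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
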
46400 1727204617.20143: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204617.20315: in run() - task 0affcd87-79f5-1303-fda8-0000000021aa 46400 1727204617.20402: variable 'ansible_search_path' from source: unknown 46400 1727204617.20410: variable 'ansible_search_path' from source: unknown 46400 1727204617.20749: calling self._execute() 46400 1727204617.20853: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.20875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.20890: variable 'omit' from source: magic vars 46400 1727204617.21376: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.21400: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.21551: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.21784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.24449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.24532: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.24584: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.24622: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.24667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.24754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.25273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.25314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.25362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.25385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.25443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.25480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.25518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.25569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.25590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.25645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.25677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.25706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.25761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.25784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.25986: variable 'network_connections' from source: include params 46400 1727204617.26001: variable 'interface' from source: play vars 46400 1727204617.26088: variable 'interface' from source: play vars 46400 1727204617.26177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204617.26348: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204617.26402: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204617.26433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204617.26468: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204617.26522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204617.26548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204617.26582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.26620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204617.26689: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204617.26977: variable 'network_connections' 
from source: include params 46400 1727204617.26988: variable 'interface' from source: play vars 46400 1727204617.27070: variable 'interface' from source: play vars 46400 1727204617.27107: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204617.27116: when evaluation is False, skipping this task 46400 1727204617.27123: _execute() done 46400 1727204617.27130: dumping result to json 46400 1727204617.27143: done dumping result, returning 46400 1727204617.27162: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000021aa] 46400 1727204617.27176: sending task result for task 0affcd87-79f5-1303-fda8-0000000021aa skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204617.27351: no more pending results, returning what we have 46400 1727204617.27356: results queue empty 46400 1727204617.27357: checking for any_errors_fatal 46400 1727204617.27369: done checking for any_errors_fatal 46400 1727204617.27370: checking for max_fail_percentage 46400 1727204617.27372: done checking for max_fail_percentage 46400 1727204617.27373: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.27374: done checking to see if all hosts have failed 46400 1727204617.27376: getting the remaining hosts for this loop 46400 1727204617.27378: done getting the remaining hosts for this loop 46400 1727204617.27382: getting the next task for host managed-node2 46400 1727204617.27392: done getting next task for host managed-node2 46400 1727204617.27396: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204617.27402: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204617.27424: getting variables 46400 1727204617.27426: in VariableManager get_vars() 46400 1727204617.27483: Calling all_inventory to load vars for managed-node2 46400 1727204617.27487: Calling groups_inventory to load vars for managed-node2 46400 1727204617.27489: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.27501: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.27504: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.27507: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.28567: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021aa 46400 1727204617.28571: WORKER PROCESS EXITING 46400 1727204617.29718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.31422: done with get_vars() 46400 1727204617.31453: done getting variables 46400 1727204617.31526: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.121) 0:01:47.600 ***** 46400 1727204617.31576: entering _queue_task() for managed-node2/package 46400 1727204617.31957: worker is 1 (out of 1 available) 46400 1727204617.31977: exiting _queue_task() for managed-node2/package 46400 1727204617.31992: done queuing things up, now waiting for results queue to drain 46400 1727204617.31994: waiting for pending results... 
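
The "Install packages" task at main.yml:73 uses the generic package action and is skipped because every entry in network_packages is already present in the package facts. The false_condition string below is verbatim from the log; the name/state arguments are assumptions. The subset test here is the Ansible built-in one, so the task only runs when at least one required package is missing from ansible_facts.packages (presumably populated by an earlier package_facts call in the role).

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed; the arguments are not visible in this log
    state: present
  when:
    - not network_packages is subset(ansible_facts.packages.keys())
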
46400 1727204617.32311: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204617.32503: in run() - task 0affcd87-79f5-1303-fda8-0000000021ab 46400 1727204617.32524: variable 'ansible_search_path' from source: unknown 46400 1727204617.32533: variable 'ansible_search_path' from source: unknown 46400 1727204617.32589: calling self._execute() 46400 1727204617.32702: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.32714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.32728: variable 'omit' from source: magic vars 46400 1727204617.33147: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.33166: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.33382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204617.33683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204617.33737: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204617.33788: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204617.33899: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204617.34033: variable 'network_packages' from source: role '' defaults 46400 1727204617.34162: variable '__network_provider_setup' from source: role '' defaults 46400 1727204617.34185: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204617.34262: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204617.34279: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204617.34353: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204617.34576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.36907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.36993: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.37042: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.37090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.37128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.37220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.37257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.37299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.37350: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.37378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.37430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.37469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.37504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.37553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.37580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.37865: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204617.38000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.38028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.38065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.38113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.38131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.38240: variable 'ansible_python' from source: facts 46400 1727204617.38274: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204617.38367: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204617.38451: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204617.38587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.38616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204617.38651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.38699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.38716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.38770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.38813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.38847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.38905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.38930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.39105: variable 'network_connections' from source: include params 46400 1727204617.39117: variable 'interface' from source: play vars 46400 1727204617.39236: variable 'interface' from source: play vars 46400 1727204617.39351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204617.39391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204617.39431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.39477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204617.39537: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.39867: variable 'network_connections' from source: include params 46400 1727204617.39879: variable 'interface' from source: play vars 46400 1727204617.39994: variable 'interface' from source: play vars 46400 1727204617.40067: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204617.40158: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.40515: variable 'network_connections' from source: include params 46400 
1727204617.40524: variable 'interface' from source: play vars 46400 1727204617.40602: variable 'interface' from source: play vars 46400 1727204617.40633: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204617.40727: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204617.41054: variable 'network_connections' from source: include params 46400 1727204617.41069: variable 'interface' from source: play vars 46400 1727204617.41134: variable 'interface' from source: play vars 46400 1727204617.41205: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204617.41277: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204617.41290: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204617.41352: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204617.41595: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204617.42145: variable 'network_connections' from source: include params 46400 1727204617.42156: variable 'interface' from source: play vars 46400 1727204617.42227: variable 'interface' from source: play vars 46400 1727204617.42245: variable 'ansible_distribution' from source: facts 46400 1727204617.42255: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.42271: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.42310: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204617.42497: variable 'ansible_distribution' from source: facts 46400 1727204617.42506: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.42518: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.42534: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204617.42719: variable 'ansible_distribution' from source: facts 46400 1727204617.42728: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.42741: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.42792: variable 'network_provider' from source: set_fact 46400 1727204617.42815: variable 'ansible_facts' from source: unknown 46400 1727204617.43717: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204617.43726: when evaluation is False, skipping this task 46400 1727204617.43733: _execute() done 46400 1727204617.43740: dumping result to json 46400 1727204617.43747: done dumping result, returning 46400 1727204617.43763: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-0000000021ab] 46400 1727204617.43780: sending task result for task 0affcd87-79f5-1303-fda8-0000000021ab skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204617.43931: no more pending results, returning what we have 46400 1727204617.43937: results queue empty 46400 1727204617.43938: checking for any_errors_fatal 46400 1727204617.43947: done checking for any_errors_fatal 46400 1727204617.43948: checking for max_fail_percentage 46400 1727204617.43950: done checking for max_fail_percentage 46400 
1727204617.43952: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.43952: done checking to see if all hosts have failed 46400 1727204617.43953: getting the remaining hosts for this loop 46400 1727204617.43955: done getting the remaining hosts for this loop 46400 1727204617.43962: getting the next task for host managed-node2 46400 1727204617.43974: done getting next task for host managed-node2 46400 1727204617.43980: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204617.43986: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204617.44008: getting variables 46400 1727204617.44010: in VariableManager get_vars() 46400 1727204617.44058: Calling all_inventory to load vars for managed-node2 46400 1727204617.44065: Calling groups_inventory to load vars for managed-node2 46400 1727204617.44073: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.44084: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.44086: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.44089: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.45133: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021ab 46400 1727204617.45137: WORKER PROCESS EXITING 46400 1727204617.46005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.47829: done with get_vars() 46400 1727204617.47879: done getting variables 46400 1727204617.47952: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.164) 0:01:47.764 ***** 46400 1727204617.47997: entering _queue_task() for managed-node2/package 46400 1727204617.48398: worker is 1 (out of 1 available) 46400 1727204617.48411: exiting _queue_task() for managed-node2/package 46400 
1727204617.48424: done queuing things up, now waiting for results queue to drain 46400 1727204617.48425: waiting for pending results... 46400 1727204617.48756: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204617.48930: in run() - task 0affcd87-79f5-1303-fda8-0000000021ac 46400 1727204617.48955: variable 'ansible_search_path' from source: unknown 46400 1727204617.48968: variable 'ansible_search_path' from source: unknown 46400 1727204617.49015: calling self._execute() 46400 1727204617.49134: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.49145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.49167: variable 'omit' from source: magic vars 46400 1727204617.49578: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.49602: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.49730: variable 'network_state' from source: role '' defaults 46400 1727204617.49751: Evaluated conditional (network_state != {}): False 46400 1727204617.49761: when evaluation is False, skipping this task 46400 1727204617.49770: _execute() done 46400 1727204617.49777: dumping result to json 46400 1727204617.49784: done dumping result, returning 46400 1727204617.49793: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000021ac] 46400 1727204617.49809: sending task result for task 0affcd87-79f5-1303-fda8-0000000021ac 46400 1727204617.49937: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021ac 46400 1727204617.49944: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204617.50003: no more pending results, returning what we have 46400 1727204617.50008: results queue empty 46400 1727204617.50009: checking for any_errors_fatal 46400 1727204617.50017: done checking for any_errors_fatal 46400 1727204617.50017: checking for max_fail_percentage 46400 1727204617.50020: done checking for max_fail_percentage 46400 1727204617.50021: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.50022: done checking to see if all hosts have failed 46400 1727204617.50022: getting the remaining hosts for this loop 46400 1727204617.50024: done getting the remaining hosts for this loop 46400 1727204617.50028: getting the next task for host managed-node2 46400 1727204617.50041: done getting next task for host managed-node2 46400 1727204617.50045: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204617.50052: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204617.50084: getting variables 46400 1727204617.50086: in VariableManager get_vars() 46400 1727204617.50138: Calling all_inventory to load vars for managed-node2 46400 1727204617.50141: Calling groups_inventory to load vars for managed-node2 46400 1727204617.50144: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.50158: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.50165: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.50169: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.52310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.54081: done with get_vars() 46400 1727204617.54120: done getting variables 46400 1727204617.54199: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.062) 0:01:47.826 ***** 46400 1727204617.54239: entering _queue_task() for managed-node2/package 46400 1727204617.54627: worker is 1 (out of 1 available) 46400 1727204617.54641: exiting _queue_task() for managed-node2/package 46400 1727204617.54654: done queuing things up, now waiting for results queue to drain 46400 1727204617.54656: waiting for pending results... 
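
Both network_state-gated install tasks (main.yml:85 above and the python3-libnmstate task at main.yml:96 queued here) are skipped for the same reason: network_state keeps its role default of {} and nothing in this run sets it. A rough sketch of the pair, with package names inferred from the task names rather than read from the role:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager             # inferred from the task name
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
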
46400 1727204617.54997: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204617.55183: in run() - task 0affcd87-79f5-1303-fda8-0000000021ad 46400 1727204617.55206: variable 'ansible_search_path' from source: unknown 46400 1727204617.55218: variable 'ansible_search_path' from source: unknown 46400 1727204617.55270: calling self._execute() 46400 1727204617.55381: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.55393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.55406: variable 'omit' from source: magic vars 46400 1727204617.55818: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.55835: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.55969: variable 'network_state' from source: role '' defaults 46400 1727204617.55985: Evaluated conditional (network_state != {}): False 46400 1727204617.55994: when evaluation is False, skipping this task 46400 1727204617.56000: _execute() done 46400 1727204617.56006: dumping result to json 46400 1727204617.56011: done dumping result, returning 46400 1727204617.56020: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000021ad] 46400 1727204617.56035: sending task result for task 0affcd87-79f5-1303-fda8-0000000021ad skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204617.56223: no more pending results, returning what we have 46400 1727204617.56228: results queue empty 46400 1727204617.56229: checking for any_errors_fatal 46400 1727204617.56239: done checking for any_errors_fatal 46400 1727204617.56240: checking for max_fail_percentage 46400 1727204617.56242: done checking for max_fail_percentage 46400 1727204617.56243: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.56244: done checking to see if all hosts have failed 46400 1727204617.56245: getting the remaining hosts for this loop 46400 1727204617.56247: done getting the remaining hosts for this loop 46400 1727204617.56252: getting the next task for host managed-node2 46400 1727204617.56267: done getting next task for host managed-node2 46400 1727204617.56272: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204617.56279: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204617.56305: getting variables 46400 1727204617.56307: in VariableManager get_vars() 46400 1727204617.56369: Calling all_inventory to load vars for managed-node2 46400 1727204617.56373: Calling groups_inventory to load vars for managed-node2 46400 1727204617.56376: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.56390: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.56394: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.56397: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.57352: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021ad 46400 1727204617.57355: WORKER PROCESS EXITING 46400 1727204617.58356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.60337: done with get_vars() 46400 1727204617.60369: done getting variables 46400 1727204617.60442: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.062) 0:01:47.889 ***** 46400 1727204617.60489: entering _queue_task() for managed-node2/service 46400 1727204617.60880: worker is 1 (out of 1 available) 46400 1727204617.60895: exiting _queue_task() for managed-node2/service 46400 1727204617.60909: done queuing things up, now waiting for results queue to drain 46400 1727204617.60911: waiting for pending results... 
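The next skip belongs to the task at roles/network/tasks/main.yml:109, which restarts the service only when wireless or team connections are part of the requested configuration. A minimal sketch under those assumptions (service name and restarted state are inferred from the task title; the role's actual wording may differ):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager        # assumed; the role may reference a service-name variable instead
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

Neither flag is set for the connection defined in this run, so the conditional reported below evaluates to False and the restart is skipped.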
46400 1727204617.61247: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204617.61439: in run() - task 0affcd87-79f5-1303-fda8-0000000021ae 46400 1727204617.61462: variable 'ansible_search_path' from source: unknown 46400 1727204617.61475: variable 'ansible_search_path' from source: unknown 46400 1727204617.61524: calling self._execute() 46400 1727204617.61640: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.61653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.61673: variable 'omit' from source: magic vars 46400 1727204617.62086: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.62102: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.62238: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.62445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.65031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.65114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.65156: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.65206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.65238: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.65330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.65378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.65415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.65471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.65493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.65551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.65585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.65619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204617.65676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.65696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.65750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.65784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.65814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.65870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.65891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.66101: variable 'network_connections' from source: include params 46400 1727204617.66119: variable 'interface' from source: play vars 46400 1727204617.66204: variable 'interface' from source: play vars 46400 1727204617.66296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204617.66480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204617.66529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204617.66569: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204617.66610: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204617.66657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204617.66691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204617.66728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.66765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204617.66845: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204617.67122: variable 'network_connections' from source: include params 46400 1727204617.67132: variable 'interface' 
from source: play vars 46400 1727204617.67209: variable 'interface' from source: play vars 46400 1727204617.67246: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204617.67265: when evaluation is False, skipping this task 46400 1727204617.67274: _execute() done 46400 1727204617.67281: dumping result to json 46400 1727204617.67288: done dumping result, returning 46400 1727204617.67298: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000021ae] 46400 1727204617.67307: sending task result for task 0affcd87-79f5-1303-fda8-0000000021ae skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204617.67476: no more pending results, returning what we have 46400 1727204617.67480: results queue empty 46400 1727204617.67481: checking for any_errors_fatal 46400 1727204617.67490: done checking for any_errors_fatal 46400 1727204617.67491: checking for max_fail_percentage 46400 1727204617.67493: done checking for max_fail_percentage 46400 1727204617.67494: checking to see if all hosts have failed and the running result is not ok 46400 1727204617.67495: done checking to see if all hosts have failed 46400 1727204617.67495: getting the remaining hosts for this loop 46400 1727204617.67497: done getting the remaining hosts for this loop 46400 1727204617.67503: getting the next task for host managed-node2 46400 1727204617.67512: done getting next task for host managed-node2 46400 1727204617.67518: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204617.67523: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204617.67546: getting variables 46400 1727204617.67548: in VariableManager get_vars() 46400 1727204617.67607: Calling all_inventory to load vars for managed-node2 46400 1727204617.67610: Calling groups_inventory to load vars for managed-node2 46400 1727204617.67613: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204617.67623: Calling all_plugins_play to load vars for managed-node2 46400 1727204617.67626: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204617.67628: Calling groups_plugins_play to load vars for managed-node2 46400 1727204617.68657: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021ae 46400 1727204617.68665: WORKER PROCESS EXITING 46400 1727204617.69636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204617.71410: done with get_vars() 46400 1727204617.71447: done getting variables 46400 1727204617.71521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.110) 0:01:48.000 ***** 46400 1727204617.71557: entering _queue_task() for managed-node2/service 46400 1727204617.71947: worker is 1 (out of 1 available) 46400 1727204617.71968: exiting _queue_task() for managed-node2/service 46400 1727204617.71982: done queuing things up, now waiting for results queue to drain 46400 1727204617.71983: waiting for pending results... 
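Unlike the two previous tasks, the task at roles/network/tasks/main.yml:122 does run: the provider is nm, so the role ensures the NetworkManager service is enabled and started. A minimal sketch, assuming the role passes the service name through the network_service_name variable that appears in the variable dump below:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager per the module_args shown further down
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

The systemd module invocation later in this section confirms the effective arguments (name=NetworkManager, state=started, enabled=true) and reports changed=false because the unit is already enabled and running on managed-node2.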
46400 1727204617.72302: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204617.72466: in run() - task 0affcd87-79f5-1303-fda8-0000000021af 46400 1727204617.72490: variable 'ansible_search_path' from source: unknown 46400 1727204617.72499: variable 'ansible_search_path' from source: unknown 46400 1727204617.72541: calling self._execute() 46400 1727204617.72665: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.72682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.72696: variable 'omit' from source: magic vars 46400 1727204617.73095: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.73120: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204617.73314: variable 'network_provider' from source: set_fact 46400 1727204617.73331: variable 'network_state' from source: role '' defaults 46400 1727204617.73346: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204617.73357: variable 'omit' from source: magic vars 46400 1727204617.73440: variable 'omit' from source: magic vars 46400 1727204617.73479: variable 'network_service_name' from source: role '' defaults 46400 1727204617.73562: variable 'network_service_name' from source: role '' defaults 46400 1727204617.73685: variable '__network_provider_setup' from source: role '' defaults 46400 1727204617.73696: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204617.73776: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204617.73790: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204617.73865: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204617.74122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204617.76653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204617.76739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204617.76793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204617.76852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204617.76896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204617.76987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.77033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.77070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.77123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204617.77148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.77202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.77236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.77274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.77324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.77344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.77609: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204617.77732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.77771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.77806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.77843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.77862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.77962: variable 'ansible_python' from source: facts 46400 1727204617.77989: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204617.78084: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204617.78170: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204617.78322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.78357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.78394: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.78450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.78475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.78533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204617.78579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204617.78609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.78666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204617.78687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204617.78850: variable 'network_connections' from source: include params 46400 1727204617.78869: variable 'interface' from source: play vars 46400 1727204617.78952: variable 'interface' from source: play vars 46400 1727204617.79090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204617.79308: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204617.79373: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204617.79430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204617.79482: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204617.80083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204617.80120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204617.80157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204617.80208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204617.80268: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204617.80577: variable 'network_connections' from source: include params 46400 1727204617.80591: variable 'interface' from source: play vars 46400 1727204617.80681: variable 'interface' from source: play vars 46400 1727204617.80741: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204617.80836: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204617.81154: variable 'network_connections' from source: include params 46400 1727204617.81175: variable 'interface' from source: play vars 46400 1727204617.81252: variable 'interface' from source: play vars 46400 1727204617.81294: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204617.81387: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204617.81721: variable 'network_connections' from source: include params 46400 1727204617.81731: variable 'interface' from source: play vars 46400 1727204617.81813: variable 'interface' from source: play vars 46400 1727204617.81890: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204617.81966: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204617.81980: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204617.82049: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204617.82292: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204617.82846: variable 'network_connections' from source: include params 46400 1727204617.82856: variable 'interface' from source: play vars 46400 1727204617.82931: variable 'interface' from source: play vars 46400 1727204617.82944: variable 'ansible_distribution' from source: facts 46400 1727204617.82952: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.82968: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.83006: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204617.83182: variable 'ansible_distribution' from source: facts 46400 1727204617.83192: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.83202: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.83214: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204617.83391: variable 'ansible_distribution' from source: facts 46400 1727204617.83399: variable '__network_rh_distros' from source: role '' defaults 46400 1727204617.83408: variable 'ansible_distribution_major_version' from source: facts 46400 1727204617.83458: variable 'network_provider' from source: set_fact 46400 1727204617.83491: variable 'omit' from source: magic vars 46400 1727204617.83521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204617.83554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204617.83585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204617.83606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204617.83622: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204617.83670: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204617.83681: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.83690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.83800: Set connection var ansible_shell_type to sh 46400 1727204617.83816: Set connection var ansible_shell_executable to /bin/sh 46400 1727204617.83826: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204617.83836: Set connection var ansible_connection to ssh 46400 1727204617.83844: Set connection var ansible_pipelining to False 46400 1727204617.83854: Set connection var ansible_timeout to 10 46400 1727204617.83899: variable 'ansible_shell_executable' from source: unknown 46400 1727204617.83907: variable 'ansible_connection' from source: unknown 46400 1727204617.83914: variable 'ansible_module_compression' from source: unknown 46400 1727204617.83920: variable 'ansible_shell_type' from source: unknown 46400 1727204617.83926: variable 'ansible_shell_executable' from source: unknown 46400 1727204617.83933: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204617.83940: variable 'ansible_pipelining' from source: unknown 46400 1727204617.83946: variable 'ansible_timeout' from source: unknown 46400 1727204617.83954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204617.84077: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204617.84108: variable 'omit' from source: magic vars 46400 1727204617.84119: starting attempt loop 46400 1727204617.84126: running the handler 46400 1727204617.84221: variable 'ansible_facts' from source: unknown 46400 1727204617.85093: _low_level_execute_command(): starting 46400 1727204617.85106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204617.85901: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204617.85916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.85929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.85951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.86001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.86013: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204617.86027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.86050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204617.86066: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204617.86079: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204617.86091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.86106: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204617.86122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.86135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.86145: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204617.86167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.86236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204617.86252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204617.86276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204617.86427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204617.88086: stdout chunk (state=3): >>>/root <<< 46400 1727204617.88282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204617.88286: stdout chunk (state=3): >>><<< 46400 1727204617.88296: stderr chunk (state=3): >>><<< 46400 1727204617.88319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204617.88335: _low_level_execute_command(): starting 46400 1727204617.88340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052 `" && echo ansible-tmp-1727204617.883197-53672-11352705264052="` echo /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052 `" ) && sleep 0' 46400 1727204617.89054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204617.89071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.89084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.89098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.89193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.89197: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204617.89199: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.89201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204617.89204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204617.89206: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204617.89208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.89384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.89501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204617.89507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.89584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204617.89599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204617.89609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204617.89680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204617.91575: stdout chunk (state=3): >>>ansible-tmp-1727204617.883197-53672-11352705264052=/root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052 <<< 46400 1727204617.91759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204617.91763: stderr chunk (state=3): >>><<< 46400 1727204617.91767: stdout chunk (state=3): >>><<< 46400 1727204617.91784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204617.883197-53672-11352705264052=/root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204617.91817: variable 'ansible_module_compression' from source: unknown 46400 1727204617.91872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204617.91936: variable 'ansible_facts' from source: unknown 46400 1727204617.92127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/AnsiballZ_systemd.py 46400 
1727204617.92295: Sending initial data 46400 1727204617.92298: Sent initial data (154 bytes) 46400 1727204617.93346: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204617.93350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.93352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.93355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.93562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.93567: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204617.93569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.93586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204617.93590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204617.93592: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204617.93594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.93596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.93600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.93602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.93604: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204617.93619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.93623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204617.93625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204617.93627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204617.93741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204617.95371: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204617.95403: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204617.95440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpm263qlv4 /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/AnsiballZ_systemd.py <<< 46400 1727204617.95484: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204617.98071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204617.98074: stderr chunk 
(state=3): >>><<< 46400 1727204617.98077: stdout chunk (state=3): >>><<< 46400 1727204617.98079: done transferring module to remote 46400 1727204617.98081: _low_level_execute_command(): starting 46400 1727204617.98083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/ /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/AnsiballZ_systemd.py && sleep 0' 46400 1727204617.98852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204617.98872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.98882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.98897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.98937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.98942: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204617.98954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.98979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204617.98986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204617.98993: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204617.99001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204617.99010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204617.99021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204617.99029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204617.99035: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204617.99045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204617.99124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204617.99143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204617.99154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204617.99228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.01055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204618.01059: stdout chunk (state=3): >>><<< 46400 1727204618.01071: stderr chunk (state=3): >>><<< 46400 1727204618.01087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204618.01094: _low_level_execute_command(): starting 46400 1727204618.01097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/AnsiballZ_systemd.py && sleep 0' 46400 1727204618.02544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204618.02551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.02561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.02580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.02617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.02621: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204618.02632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.02645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204618.02651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204618.02658: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204618.02688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.02700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.02712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.02715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.02725: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204618.02732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.02802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204618.02817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204618.02820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204618.02906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.28354: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": 
"yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6926336", "MemoryAvailable": "infinity", "CPUUsageNSec": "2265698000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimes<<< 46400 1727204618.28371: stdout chunk (state=3): >>>tampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 
EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204618.30045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204618.30049: stdout chunk (state=3): >>><<< 46400 1727204618.30052: stderr chunk (state=3): >>><<< 46400 1727204618.30355: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": 
"system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6926336", "MemoryAvailable": "infinity", "CPUUsageNSec": "2265698000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", 
"RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204618.30373: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204618.30377: _low_level_execute_command(): starting 46400 1727204618.30379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204617.883197-53672-11352705264052/ > /dev/null 2>&1 && sleep 0' 46400 1727204618.30955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204618.30977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.30994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.31014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.31069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.31084: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204618.31100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.31119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204618.31131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204618.31142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204618.31154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.31174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.31190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.31201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 
1727204618.31211: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204618.31225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.31313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204618.31330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204618.31344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204618.31419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.33283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204618.33315: stderr chunk (state=3): >>><<< 46400 1727204618.33319: stdout chunk (state=3): >>><<< 46400 1727204618.33336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204618.33343: handler run complete 46400 1727204618.33410: attempt loop complete, returning result 46400 1727204618.33413: _execute() done 46400 1727204618.33416: dumping result to json 46400 1727204618.33434: done dumping result, returning 46400 1727204618.33443: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-0000000021af] 46400 1727204618.33449: sending task result for task 0affcd87-79f5-1303-fda8-0000000021af ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204618.33759: no more pending results, returning what we have 46400 1727204618.33768: results queue empty 46400 1727204618.33770: checking for any_errors_fatal 46400 1727204618.33781: done checking for any_errors_fatal 46400 1727204618.33782: checking for max_fail_percentage 46400 1727204618.33785: done checking for max_fail_percentage 46400 1727204618.33786: checking to see if all hosts have failed and the running result is not ok 46400 1727204618.33787: done checking to see if all hosts have failed 46400 1727204618.33788: getting the remaining hosts for this loop 46400 1727204618.33789: done getting the remaining hosts for this loop 46400 1727204618.33794: getting the next task for host managed-node2 46400 1727204618.33804: done getting next task for host managed-node2 46400 1727204618.33809: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204618.33814: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204618.33830: getting variables 46400 1727204618.33832: in VariableManager get_vars() 46400 1727204618.33884: Calling all_inventory to load vars for managed-node2 46400 1727204618.33887: Calling groups_inventory to load vars for managed-node2 46400 1727204618.33890: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204618.33901: Calling all_plugins_play to load vars for managed-node2 46400 1727204618.33904: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204618.33907: Calling groups_plugins_play to load vars for managed-node2 46400 1727204618.34483: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021af 46400 1727204618.34491: WORKER PROCESS EXITING 46400 1727204618.36015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204618.37827: done with get_vars() 46400 1727204618.37863: done getting variables 46400 1727204618.37935: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.664) 0:01:48.664 ***** 46400 1727204618.37977: entering _queue_task() for managed-node2/service 46400 1727204618.38362: worker is 1 (out of 1 available) 46400 1727204618.38378: exiting _queue_task() for managed-node2/service 46400 1727204618.38391: done queuing things up, now waiting for results queue to drain 46400 1727204618.38393: waiting for pending results... 
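The module invocation recorded above ("name": "NetworkManager", "state": "started", "enabled": true) is what the role's service-management step ultimately passes to the systemd module over this connection. A minimal task that would produce an equivalent invocation (an illustrative sketch, not the role's verbatim source; the module name is assumed from the ansible.legacy.systemd call seen in the log) looks like:

- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true          # matches the censored result shown above

Because NetworkManager.service is already enabled and active on the target (UnitFileState=enabled, ActiveState=active), the module reports changed: false.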
46400 1727204618.38726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204618.38932: in run() - task 0affcd87-79f5-1303-fda8-0000000021b0 46400 1727204618.38960: variable 'ansible_search_path' from source: unknown 46400 1727204618.38976: variable 'ansible_search_path' from source: unknown 46400 1727204618.39022: calling self._execute() 46400 1727204618.39134: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.39149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.39163: variable 'omit' from source: magic vars 46400 1727204618.39611: variable 'ansible_distribution_major_version' from source: facts 46400 1727204618.39640: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204618.39777: variable 'network_provider' from source: set_fact 46400 1727204618.39789: Evaluated conditional (network_provider == "nm"): True 46400 1727204618.39897: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204618.40003: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204618.40214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204618.42698: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204618.42793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204618.42842: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204618.42893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204618.42927: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204618.43038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204618.43074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204618.43114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204618.43161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204618.43181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204618.43243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204618.43274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204618.43309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204618.43366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204618.43386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204618.43431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204618.43458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204618.43485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204618.43525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204618.43546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204618.43710: variable 'network_connections' from source: include params 46400 1727204618.43731: variable 'interface' from source: play vars 46400 1727204618.43814: variable 'interface' from source: play vars 46400 1727204618.43902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204618.44111: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204618.44158: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204618.44210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204618.44246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204618.44305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204618.44338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204618.44370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204618.44412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
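The filter and test plugins loaded above are used to evaluate the role defaults that decide whether wpa_supplicant is required; the variable names in this log suggest the requirement is derived from whether any declared connection uses 802.1X or wireless settings. A plausible shape for those defaults (a hypothetical reconstruction from the variable names only, not the role's exact source):

__network_ieee802_1x_connections_defined: "{{ network_connections | selectattr('ieee802_1x', 'defined') | list | length > 0 }}"
__network_wireless_connections_defined: "{{ network_connections | selectattr('wireless', 'defined') | list | length > 0 }}"
__network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"

In this run the connection list only contains the test interface profile, so the requirement evaluates to False and the task is skipped, as the entries that follow show.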
46400 1727204618.44478: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204618.44782: variable 'network_connections' from source: include params 46400 1727204618.44794: variable 'interface' from source: play vars 46400 1727204618.44878: variable 'interface' from source: play vars 46400 1727204618.44925: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204618.44940: when evaluation is False, skipping this task 46400 1727204618.44950: _execute() done 46400 1727204618.44958: dumping result to json 46400 1727204618.44972: done dumping result, returning 46400 1727204618.44984: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-0000000021b0] 46400 1727204618.45004: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b0 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204618.45159: no more pending results, returning what we have 46400 1727204618.45165: results queue empty 46400 1727204618.45166: checking for any_errors_fatal 46400 1727204618.45187: done checking for any_errors_fatal 46400 1727204618.45188: checking for max_fail_percentage 46400 1727204618.45191: done checking for max_fail_percentage 46400 1727204618.45192: checking to see if all hosts have failed and the running result is not ok 46400 1727204618.45193: done checking to see if all hosts have failed 46400 1727204618.45194: getting the remaining hosts for this loop 46400 1727204618.45196: done getting the remaining hosts for this loop 46400 1727204618.45201: getting the next task for host managed-node2 46400 1727204618.45211: done getting next task for host managed-node2 46400 1727204618.45216: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204618.45221: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204618.45245: getting variables 46400 1727204618.45247: in VariableManager get_vars() 46400 1727204618.45303: Calling all_inventory to load vars for managed-node2 46400 1727204618.45306: Calling groups_inventory to load vars for managed-node2 46400 1727204618.45309: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204618.45321: Calling all_plugins_play to load vars for managed-node2 46400 1727204618.45324: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204618.45327: Calling groups_plugins_play to load vars for managed-node2 46400 1727204618.46286: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b0 46400 1727204618.46290: WORKER PROCESS EXITING 46400 1727204618.47310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204618.49084: done with get_vars() 46400 1727204618.49121: done getting variables 46400 1727204618.49203: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.112) 0:01:48.776 ***** 46400 1727204618.49243: entering _queue_task() for managed-node2/service 46400 1727204618.49642: worker is 1 (out of 1 available) 46400 1727204618.49658: exiting _queue_task() for managed-node2/service 46400 1727204618.49676: done queuing things up, now waiting for results queue to drain 46400 1727204618.49678: waiting for pending results... 
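The skip recorded above comes from when: guards on the task; the false_condition field in the skipped result names the conditional that evaluated to False. An equivalent guarded task would look roughly like this (an illustrative sketch using the standard service module; both conditions are taken verbatim from the evaluations in the log):

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool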
46400 1727204618.50011: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204618.50202: in run() - task 0affcd87-79f5-1303-fda8-0000000021b1 46400 1727204618.50223: variable 'ansible_search_path' from source: unknown 46400 1727204618.50233: variable 'ansible_search_path' from source: unknown 46400 1727204618.50286: calling self._execute() 46400 1727204618.50406: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.50418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.50431: variable 'omit' from source: magic vars 46400 1727204618.50850: variable 'ansible_distribution_major_version' from source: facts 46400 1727204618.50874: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204618.51013: variable 'network_provider' from source: set_fact 46400 1727204618.51030: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204618.51040: when evaluation is False, skipping this task 46400 1727204618.51046: _execute() done 46400 1727204618.51053: dumping result to json 46400 1727204618.51059: done dumping result, returning 46400 1727204618.51070: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-0000000021b1] 46400 1727204618.51080: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b1 46400 1727204618.51204: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b1 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204618.51251: no more pending results, returning what we have 46400 1727204618.51255: results queue empty 46400 1727204618.51256: checking for any_errors_fatal 46400 1727204618.51267: done checking for any_errors_fatal 46400 1727204618.51269: checking for max_fail_percentage 46400 1727204618.51271: done checking for max_fail_percentage 46400 1727204618.51272: checking to see if all hosts have failed and the running result is not ok 46400 1727204618.51273: done checking to see if all hosts have failed 46400 1727204618.51274: getting the remaining hosts for this loop 46400 1727204618.51275: done getting the remaining hosts for this loop 46400 1727204618.51280: getting the next task for host managed-node2 46400 1727204618.51291: done getting next task for host managed-node2 46400 1727204618.51297: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204618.51305: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204618.51329: getting variables 46400 1727204618.51331: in VariableManager get_vars() 46400 1727204618.51381: Calling all_inventory to load vars for managed-node2 46400 1727204618.51384: Calling groups_inventory to load vars for managed-node2 46400 1727204618.51386: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204618.51399: Calling all_plugins_play to load vars for managed-node2 46400 1727204618.51402: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204618.51404: Calling groups_plugins_play to load vars for managed-node2 46400 1727204618.52383: WORKER PROCESS EXITING 46400 1727204618.53499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204618.55250: done with get_vars() 46400 1727204618.55298: done getting variables 46400 1727204618.55367: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.061) 0:01:48.838 ***** 46400 1727204618.55415: entering _queue_task() for managed-node2/copy 46400 1727204618.55806: worker is 1 (out of 1 available) 46400 1727204618.55829: exiting _queue_task() for managed-node2/copy 46400 1727204618.55841: done queuing things up, now waiting for results queue to drain 46400 1727204618.55843: waiting for pending results... 
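This task and the copy task that follows are both guarded by the provider check, so with network_provider set to "nm" everything initscripts-specific is skipped. A sketch of that pattern (the task body is hypothetical; only the when: expression is taken from the log):

- name: Enable network service
  ansible.builtin.service:
    name: network        # hypothetical legacy initscripts service name
    enabled: true
    state: started
  when: network_provider == "initscripts"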
46400 1727204618.56172: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204618.56339: in run() - task 0affcd87-79f5-1303-fda8-0000000021b2 46400 1727204618.56365: variable 'ansible_search_path' from source: unknown 46400 1727204618.56376: variable 'ansible_search_path' from source: unknown 46400 1727204618.56417: calling self._execute() 46400 1727204618.56530: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.56542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.56556: variable 'omit' from source: magic vars 46400 1727204618.56966: variable 'ansible_distribution_major_version' from source: facts 46400 1727204618.56983: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204618.57116: variable 'network_provider' from source: set_fact 46400 1727204618.57136: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204618.57145: when evaluation is False, skipping this task 46400 1727204618.57157: _execute() done 46400 1727204618.57167: dumping result to json 46400 1727204618.57175: done dumping result, returning 46400 1727204618.57186: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-0000000021b2] 46400 1727204618.57197: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b2 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204618.57357: no more pending results, returning what we have 46400 1727204618.57361: results queue empty 46400 1727204618.57363: checking for any_errors_fatal 46400 1727204618.57372: done checking for any_errors_fatal 46400 1727204618.57373: checking for max_fail_percentage 46400 1727204618.57375: done checking for max_fail_percentage 46400 1727204618.57376: checking to see if all hosts have failed and the running result is not ok 46400 1727204618.57377: done checking to see if all hosts have failed 46400 1727204618.57377: getting the remaining hosts for this loop 46400 1727204618.57379: done getting the remaining hosts for this loop 46400 1727204618.57383: getting the next task for host managed-node2 46400 1727204618.57392: done getting next task for host managed-node2 46400 1727204618.57397: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204618.57403: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204618.57425: getting variables 46400 1727204618.57427: in VariableManager get_vars() 46400 1727204618.57486: Calling all_inventory to load vars for managed-node2 46400 1727204618.57490: Calling groups_inventory to load vars for managed-node2 46400 1727204618.57492: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204618.57508: Calling all_plugins_play to load vars for managed-node2 46400 1727204618.57511: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204618.57514: Calling groups_plugins_play to load vars for managed-node2 46400 1727204618.58515: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b2 46400 1727204618.58519: WORKER PROCESS EXITING 46400 1727204618.59466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204618.61460: done with get_vars() 46400 1727204618.61495: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.061) 0:01:48.900 ***** 46400 1727204618.61596: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204618.61969: worker is 1 (out of 1 available) 46400 1727204618.61983: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204618.61996: done queuing things up, now waiting for results queue to drain 46400 1727204618.61998: waiting for pending results... 
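The task being queued here passes the network_connections list, built from the play's interface variable, to the collection's network_connections module. The profile contents themselves are never printed in this log, but a hypothetical minimal entry of the kind such a list carries might look like:

network_connections:
  - name: "{{ interface }}"   # play-provided interface name referenced in the log
    type: ethernet            # assumed; the real profile's settings are not shown
    state: up

Before invoking the module, the role also renders an ansible_managed header through the get_ansible_managed.j2 template lookup, which is what the evaluation_path and search_path entries below are resolving.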
46400 1727204618.62304: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204618.62475: in run() - task 0affcd87-79f5-1303-fda8-0000000021b3 46400 1727204618.62507: variable 'ansible_search_path' from source: unknown 46400 1727204618.62520: variable 'ansible_search_path' from source: unknown 46400 1727204618.62588: calling self._execute() 46400 1727204618.62709: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.62721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.62736: variable 'omit' from source: magic vars 46400 1727204618.63166: variable 'ansible_distribution_major_version' from source: facts 46400 1727204618.63184: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204618.63201: variable 'omit' from source: magic vars 46400 1727204618.63281: variable 'omit' from source: magic vars 46400 1727204618.63477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204618.66217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204618.66306: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204618.66352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204618.66414: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204618.66451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204618.66556: variable 'network_provider' from source: set_fact 46400 1727204618.66725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204618.66759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204618.66795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204618.66855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204618.66880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204618.66976: variable 'omit' from source: magic vars 46400 1727204618.67106: variable 'omit' from source: magic vars 46400 1727204618.67223: variable 'network_connections' from source: include params 46400 1727204618.67242: variable 'interface' from source: play vars 46400 1727204618.67316: variable 'interface' from source: play vars 46400 1727204618.67497: variable 'omit' from source: magic vars 46400 1727204618.67511: variable '__lsr_ansible_managed' from source: task vars 46400 1727204618.67579: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204618.67811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204618.68051: Loaded config def from plugin (lookup/template) 46400 1727204618.68060: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204618.68094: File lookup term: get_ansible_managed.j2 46400 1727204618.68101: variable 'ansible_search_path' from source: unknown 46400 1727204618.68109: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204618.68124: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204618.68162: variable 'ansible_search_path' from source: unknown 46400 1727204618.85986: variable 'ansible_managed' from source: unknown 46400 1727204618.86339: variable 'omit' from source: magic vars 46400 1727204618.86379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204618.86412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204618.86429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204618.86446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204618.86457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204618.86480: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204618.86487: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.86502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.86587: Set connection var ansible_shell_type to sh 46400 1727204618.86730: Set connection var ansible_shell_executable to /bin/sh 46400 1727204618.86740: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204618.86750: Set connection var ansible_connection to ssh 46400 1727204618.86759: Set connection var ansible_pipelining to False 46400 1727204618.86773: Set connection var ansible_timeout to 10 46400 1727204618.86804: variable 'ansible_shell_executable' from source: unknown 46400 1727204618.86943: variable 'ansible_connection' from source: unknown 46400 1727204618.86952: variable 'ansible_module_compression' 
from source: unknown 46400 1727204618.86959: variable 'ansible_shell_type' from source: unknown 46400 1727204618.86968: variable 'ansible_shell_executable' from source: unknown 46400 1727204618.86976: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204618.86984: variable 'ansible_pipelining' from source: unknown 46400 1727204618.86990: variable 'ansible_timeout' from source: unknown 46400 1727204618.86998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204618.87139: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204618.87284: variable 'omit' from source: magic vars 46400 1727204618.87295: starting attempt loop 46400 1727204618.87302: running the handler 46400 1727204618.87317: _low_level_execute_command(): starting 46400 1727204618.87385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204618.88635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204618.88650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.88666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.88738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.88750: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204618.88763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.88783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204618.88795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204618.88812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204618.88823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.88835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.88851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.88863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.88876: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204618.88889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.88974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204618.88997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204618.89012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204618.89094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.90746: stdout chunk (state=3): >>>/root <<< 46400 1727204618.90879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204618.90953: stderr chunk (state=3): >>><<< 46400 1727204618.90957: stdout 
chunk (state=3): >>><<< 46400 1727204618.91077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204618.91080: _low_level_execute_command(): starting 46400 1727204618.91083: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541 `" && echo ansible-tmp-1727204618.9098487-53732-38544040928541="` echo /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541 `" ) && sleep 0' 46400 1727204618.91670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204618.91685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.91701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.91721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.91773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.91787: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204618.91802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.91820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204618.91834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204618.91853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204618.91867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.91882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.91898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.91911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.91922: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204618.91935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.92020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 46400 1727204618.92043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204618.92073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204618.92145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.93985: stdout chunk (state=3): >>>ansible-tmp-1727204618.9098487-53732-38544040928541=/root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541 <<< 46400 1727204618.94188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204618.94192: stdout chunk (state=3): >>><<< 46400 1727204618.94194: stderr chunk (state=3): >>><<< 46400 1727204618.94516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204618.9098487-53732-38544040928541=/root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204618.94523: variable 'ansible_module_compression' from source: unknown 46400 1727204618.94526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204618.94528: variable 'ansible_facts' from source: unknown 46400 1727204618.94532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/AnsiballZ_network_connections.py 46400 1727204618.94624: Sending initial data 46400 1727204618.94628: Sent initial data (167 bytes) 46400 1727204618.95577: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204618.95586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.95597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.95610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.95652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.95662: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204618.95673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.95687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
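Every remote step in this section reuses a single multiplexed SSH connection (debug1: auto-mux: Trying existing master, mux_client_request_session), so each command performs only the cheap mux handshake instead of a full key exchange and authentication. Connection sharing of this kind is normally driven by OpenSSH ControlMaster options; one way it could be configured per host or group (illustrative values only, the actual options used for this run are not visible in the log) is:

ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%h-%p-%r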
46400 1727204618.95696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204618.95701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204618.95710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204618.95720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204618.95727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204618.95734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204618.95740: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204618.95749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204618.95815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204618.95841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204618.95844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204618.95955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204618.97617: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204618.97658: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204618.97701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp87bz4_ba /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/AnsiballZ_network_connections.py <<< 46400 1727204618.97747: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204618.99689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204618.99875: stderr chunk (state=3): >>><<< 46400 1727204618.99879: stdout chunk (state=3): >>><<< 46400 1727204618.99881: done transferring module to remote 46400 1727204618.99883: _low_level_execute_command(): starting 46400 1727204618.99886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/ /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/AnsiballZ_network_connections.py && sleep 0' 46400 1727204619.00624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204619.00648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204619.00672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.00692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.00737: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.00757: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204619.00782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.00801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204619.00814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204619.00826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204619.00839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204619.00854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.00885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.00898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.00910: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204619.00924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.01010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204619.01033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204619.01050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204619.01125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204619.02931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204619.02935: stdout chunk (state=3): >>><<< 46400 1727204619.02937: stderr chunk (state=3): >>><<< 46400 1727204619.03042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204619.03050: _low_level_execute_command(): starting 46400 1727204619.03052: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/AnsiballZ_network_connections.py && sleep 0' 46400 1727204619.03672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204619.03687: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204619.03709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.03730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.03780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.03793: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204619.03814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.03833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204619.03845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204619.03856: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204619.03875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204619.03889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.03905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.03923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.03936: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204619.03950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.04035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204619.04063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204619.04084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204619.04169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204619.29304: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204619.31988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204619.32067: stderr chunk (state=3): >>><<< 46400 1727204619.32071: stdout chunk (state=3): >>><<< 46400 1727204619.32173: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
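The module result above amounts to one new persistent bridge profile, 'statebr', with DHCPv4 and IPv6 autoconf disabled, applied through the 'nm' provider. For orientation, the role input that produces exactly these module_args would look roughly like the sketch below; the connection values are copied from the log, while the play layout and hosts pattern are assumed for illustration only.

# Sketch only: role vars corresponding to the module_args shown above.
- hosts: managed-node2
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr
        type: bridge
        persistent_state: present
        ip:
          dhcp4: false
          auto6: false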
46400 1727204619.32177: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204619.32180: _low_level_execute_command(): starting 46400 1727204619.32183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204618.9098487-53732-38544040928541/ > /dev/null 2>&1 && sleep 0' 46400 1727204619.34575: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204619.34584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204619.34595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.34608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.34649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.34662: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204619.34673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.34687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204619.34694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204619.34701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204619.34708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204619.34717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204619.34728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204619.34735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204619.34742: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204619.34751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204619.34826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204619.34887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204619.34895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204619.35057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204619.36895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
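Every module call in this run follows the same remote lifecycle visible above: create a per-task directory under ~/.ansible/tmp, sftp the AnsiballZ payload across, chmod u+x it, execute it with the remote Python, then rm -f -r the directory. That round trip happens because pipelining is disabled for this connection (the connection vars later in the log show ansible_pipelining set to False). If the managed nodes permit it, the standard connection variable sketched below skips the temp-file staging; applying it to this particular inventory is an assumption, not something this log sets.

# group_vars/all.yml (sketch) - send module payloads over the existing SSH
# session instead of staging them in a remote temp directory first.
ansible_pipelining: true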
46400 1727204619.36899: stdout chunk (state=3): >>><<< 46400 1727204619.36906: stderr chunk (state=3): >>><<< 46400 1727204619.36923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204619.36928: handler run complete 46400 1727204619.36967: attempt loop complete, returning result 46400 1727204619.36970: _execute() done 46400 1727204619.36973: dumping result to json 46400 1727204619.36975: done dumping result, returning 46400 1727204619.36985: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-0000000021b3] 46400 1727204619.36987: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b3 46400 1727204619.37103: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b3 46400 1727204619.37106: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 46400 1727204619.37203: no more pending results, returning what we have 46400 1727204619.37206: results queue empty 46400 1727204619.37207: checking for any_errors_fatal 46400 1727204619.37212: done checking for any_errors_fatal 46400 1727204619.37213: checking for max_fail_percentage 46400 1727204619.37215: done checking for max_fail_percentage 46400 1727204619.37216: checking to see if all hosts have failed and the running result is not ok 46400 1727204619.37217: done checking to see if all hosts have failed 46400 1727204619.37217: getting the remaining hosts for this loop 46400 1727204619.37219: done getting the remaining hosts for this loop 46400 1727204619.37222: getting the next task for host managed-node2 46400 1727204619.37229: done getting next task for host managed-node2 46400 1727204619.37233: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204619.37237: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204619.37249: getting variables 46400 1727204619.37250: in VariableManager get_vars() 46400 1727204619.37295: Calling all_inventory to load vars for managed-node2 46400 1727204619.37298: Calling groups_inventory to load vars for managed-node2 46400 1727204619.37300: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204619.37309: Calling all_plugins_play to load vars for managed-node2 46400 1727204619.37312: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204619.37314: Calling groups_plugins_play to load vars for managed-node2 46400 1727204619.40198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204619.45066: done with get_vars() 46400 1727204619.45110: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.836) 0:01:49.736 ***** 46400 1727204619.45213: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204619.46229: worker is 1 (out of 1 available) 46400 1727204619.46243: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204619.46257: done queuing things up, now waiting for results queue to drain 46400 1727204619.46259: waiting for pending results... 
46400 1727204619.47310: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204619.47606: in run() - task 0affcd87-79f5-1303-fda8-0000000021b4 46400 1727204619.47778: variable 'ansible_search_path' from source: unknown 46400 1727204619.47788: variable 'ansible_search_path' from source: unknown 46400 1727204619.47832: calling self._execute() 46400 1727204619.48156: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.48175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.48201: variable 'omit' from source: magic vars 46400 1727204619.49015: variable 'ansible_distribution_major_version' from source: facts 46400 1727204619.49186: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204619.49331: variable 'network_state' from source: role '' defaults 46400 1727204619.49404: Evaluated conditional (network_state != {}): False 46400 1727204619.49504: when evaluation is False, skipping this task 46400 1727204619.49511: _execute() done 46400 1727204619.49518: dumping result to json 46400 1727204619.49523: done dumping result, returning 46400 1727204619.49533: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-0000000021b4] 46400 1727204619.49541: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b4 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204619.49707: no more pending results, returning what we have 46400 1727204619.49712: results queue empty 46400 1727204619.49713: checking for any_errors_fatal 46400 1727204619.49725: done checking for any_errors_fatal 46400 1727204619.49726: checking for max_fail_percentage 46400 1727204619.49728: done checking for max_fail_percentage 46400 1727204619.49729: checking to see if all hosts have failed and the running result is not ok 46400 1727204619.49729: done checking to see if all hosts have failed 46400 1727204619.49730: getting the remaining hosts for this loop 46400 1727204619.49732: done getting the remaining hosts for this loop 46400 1727204619.49736: getting the next task for host managed-node2 46400 1727204619.49745: done getting next task for host managed-node2 46400 1727204619.49749: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204619.49754: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204619.49782: getting variables 46400 1727204619.49784: in VariableManager get_vars() 46400 1727204619.49841: Calling all_inventory to load vars for managed-node2 46400 1727204619.49845: Calling groups_inventory to load vars for managed-node2 46400 1727204619.49848: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204619.49868: Calling all_plugins_play to load vars for managed-node2 46400 1727204619.49872: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204619.49876: Calling groups_plugins_play to load vars for managed-node2 46400 1727204619.51467: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b4 46400 1727204619.51471: WORKER PROCESS EXITING 46400 1727204619.53423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204619.70952: done with get_vars() 46400 1727204619.70992: done getting variables 46400 1727204619.71044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.258) 0:01:49.995 ***** 46400 1727204619.71078: entering _queue_task() for managed-node2/debug 46400 1727204619.71441: worker is 1 (out of 1 available) 46400 1727204619.71458: exiting _queue_task() for managed-node2/debug 46400 1727204619.71475: done queuing things up, now waiting for results queue to drain 46400 1727204619.71477: waiting for pending results... 
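The 'Configure networking state' task above was skipped because network_state is still the role default of {}, so the conditional network_state != {} evaluated to False; this run drives everything through connection profiles instead. For contrast, a state-driven invocation would populate that variable roughly as below; the nmstate-style keys are an assumption about the schema the role accepts, not something taken from this log.

# Hypothetical only: a network_state value that would make the skipped task run.
network_state:
  interfaces:
    - name: statebr
      type: linux-bridge
      state: up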
46400 1727204619.71788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204619.71990: in run() - task 0affcd87-79f5-1303-fda8-0000000021b5 46400 1727204619.72051: variable 'ansible_search_path' from source: unknown 46400 1727204619.72061: variable 'ansible_search_path' from source: unknown 46400 1727204619.72111: calling self._execute() 46400 1727204619.73130: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.73146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.73167: variable 'omit' from source: magic vars 46400 1727204619.74042: variable 'ansible_distribution_major_version' from source: facts 46400 1727204619.74066: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204619.74079: variable 'omit' from source: magic vars 46400 1727204619.74182: variable 'omit' from source: magic vars 46400 1727204619.74372: variable 'omit' from source: magic vars 46400 1727204619.74422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204619.74589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204619.74620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204619.74646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204619.74673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204619.74797: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204619.74806: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.74814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.75030: Set connection var ansible_shell_type to sh 46400 1727204619.75043: Set connection var ansible_shell_executable to /bin/sh 46400 1727204619.75050: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204619.75057: Set connection var ansible_connection to ssh 46400 1727204619.75068: Set connection var ansible_pipelining to False 46400 1727204619.75076: Set connection var ansible_timeout to 10 46400 1727204619.75108: variable 'ansible_shell_executable' from source: unknown 46400 1727204619.75205: variable 'ansible_connection' from source: unknown 46400 1727204619.75212: variable 'ansible_module_compression' from source: unknown 46400 1727204619.75218: variable 'ansible_shell_type' from source: unknown 46400 1727204619.75223: variable 'ansible_shell_executable' from source: unknown 46400 1727204619.75230: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.75238: variable 'ansible_pipelining' from source: unknown 46400 1727204619.75246: variable 'ansible_timeout' from source: unknown 46400 1727204619.75254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.75642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204619.75668: variable 'omit' from source: magic vars 46400 1727204619.75680: starting attempt loop 46400 1727204619.75687: running the handler 46400 1727204619.75971: variable '__network_connections_result' from source: set_fact 46400 1727204619.76038: handler run complete 46400 1727204619.76205: attempt loop complete, returning result 46400 1727204619.76216: _execute() done 46400 1727204619.76224: dumping result to json 46400 1727204619.76232: done dumping result, returning 46400 1727204619.76244: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-0000000021b5] 46400 1727204619.76254: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b5 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807" ] } 46400 1727204619.76470: no more pending results, returning what we have 46400 1727204619.76475: results queue empty 46400 1727204619.76477: checking for any_errors_fatal 46400 1727204619.76485: done checking for any_errors_fatal 46400 1727204619.76486: checking for max_fail_percentage 46400 1727204619.76488: done checking for max_fail_percentage 46400 1727204619.76489: checking to see if all hosts have failed and the running result is not ok 46400 1727204619.76490: done checking to see if all hosts have failed 46400 1727204619.76491: getting the remaining hosts for this loop 46400 1727204619.76492: done getting the remaining hosts for this loop 46400 1727204619.76496: getting the next task for host managed-node2 46400 1727204619.76506: done getting next task for host managed-node2 46400 1727204619.76511: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204619.76519: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204619.76537: getting variables 46400 1727204619.76539: in VariableManager get_vars() 46400 1727204619.76594: Calling all_inventory to load vars for managed-node2 46400 1727204619.76597: Calling groups_inventory to load vars for managed-node2 46400 1727204619.76600: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204619.76611: Calling all_plugins_play to load vars for managed-node2 46400 1727204619.76614: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204619.76617: Calling groups_plugins_play to load vars for managed-node2 46400 1727204619.77978: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b5 46400 1727204619.77982: WORKER PROCESS EXITING 46400 1727204619.79871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204619.83378: done with get_vars() 46400 1727204619.83533: done getting variables 46400 1727204619.83605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.126) 0:01:50.122 ***** 46400 1727204619.83771: entering _queue_task() for managed-node2/debug 46400 1727204619.84490: worker is 1 (out of 1 available) 46400 1727204619.84503: exiting _queue_task() for managed-node2/debug 46400 1727204619.84632: done queuing things up, now waiting for results queue to drain 46400 1727204619.84635: waiting for pending results... 
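The two debug tasks at main.yml:177 and main.yml:181 only surface the __network_connections_result fact registered by the profile task; judging from the task names and the variables they print, their shape in the role's task file is roughly the following (a reconstruction for readability, not the verbatim source).

# Approximate shape of the two debug tasks seen in this run.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result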
46400 1727204619.85571: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204619.85813: in run() - task 0affcd87-79f5-1303-fda8-0000000021b6 46400 1727204619.85834: variable 'ansible_search_path' from source: unknown 46400 1727204619.85843: variable 'ansible_search_path' from source: unknown 46400 1727204619.85895: calling self._execute() 46400 1727204619.86022: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.86042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.86057: variable 'omit' from source: magic vars 46400 1727204619.87026: variable 'ansible_distribution_major_version' from source: facts 46400 1727204619.87044: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204619.87057: variable 'omit' from source: magic vars 46400 1727204619.87184: variable 'omit' from source: magic vars 46400 1727204619.87372: variable 'omit' from source: magic vars 46400 1727204619.87422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204619.87484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204619.87576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204619.87681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204619.87697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204619.87736: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204619.87774: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.87783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.87972: Set connection var ansible_shell_type to sh 46400 1727204619.88104: Set connection var ansible_shell_executable to /bin/sh 46400 1727204619.88113: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204619.88120: Set connection var ansible_connection to ssh 46400 1727204619.88127: Set connection var ansible_pipelining to False 46400 1727204619.88134: Set connection var ansible_timeout to 10 46400 1727204619.88158: variable 'ansible_shell_executable' from source: unknown 46400 1727204619.88169: variable 'ansible_connection' from source: unknown 46400 1727204619.88175: variable 'ansible_module_compression' from source: unknown 46400 1727204619.88180: variable 'ansible_shell_type' from source: unknown 46400 1727204619.88185: variable 'ansible_shell_executable' from source: unknown 46400 1727204619.88190: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.88200: variable 'ansible_pipelining' from source: unknown 46400 1727204619.88209: variable 'ansible_timeout' from source: unknown 46400 1727204619.88215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.88481: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204619.88553: variable 'omit' from source: magic vars 46400 1727204619.88647: starting attempt loop 46400 1727204619.88654: running the handler 46400 1727204619.88713: variable '__network_connections_result' from source: set_fact 46400 1727204619.88839: variable '__network_connections_result' from source: set_fact 46400 1727204619.89211: handler run complete 46400 1727204619.89246: attempt loop complete, returning result 46400 1727204619.89296: _execute() done 46400 1727204619.89307: dumping result to json 46400 1727204619.89407: done dumping result, returning 46400 1727204619.89422: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-0000000021b6] 46400 1727204619.89434: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b6 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807" ] } } 46400 1727204619.89663: no more pending results, returning what we have 46400 1727204619.89669: results queue empty 46400 1727204619.89670: checking for any_errors_fatal 46400 1727204619.89678: done checking for any_errors_fatal 46400 1727204619.89679: checking for max_fail_percentage 46400 1727204619.89681: done checking for max_fail_percentage 46400 1727204619.89682: checking to see if all hosts have failed and the running result is not ok 46400 1727204619.89683: done checking to see if all hosts have failed 46400 1727204619.89684: getting the remaining hosts for this loop 46400 1727204619.89686: done getting the remaining hosts for this loop 46400 1727204619.89690: getting the next task for host managed-node2 46400 1727204619.89700: done getting next task for host managed-node2 46400 1727204619.89706: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204619.89712: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204619.89728: getting variables 46400 1727204619.89730: in VariableManager get_vars() 46400 1727204619.89786: Calling all_inventory to load vars for managed-node2 46400 1727204619.89790: Calling groups_inventory to load vars for managed-node2 46400 1727204619.89798: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204619.89810: Calling all_plugins_play to load vars for managed-node2 46400 1727204619.89813: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204619.89816: Calling groups_plugins_play to load vars for managed-node2 46400 1727204619.90843: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b6 46400 1727204619.90846: WORKER PROCESS EXITING 46400 1727204619.93594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204619.97153: done with get_vars() 46400 1727204619.97312: done getting variables 46400 1727204619.97381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.137) 0:01:50.259 ***** 46400 1727204619.97533: entering _queue_task() for managed-node2/debug 46400 1727204619.98247: worker is 1 (out of 1 available) 46400 1727204619.98375: exiting _queue_task() for managed-node2/debug 46400 1727204619.98388: done queuing things up, now waiting for results queue to drain 46400 1727204619.98389: waiting for pending results... 
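After the network_state debug task is skipped just below, the role closes this block with 'Re-test connectivity' (main.yml:192), which runs the plain ping module to confirm the managed host is still reachable once the profile change has been applied. Its shape is roughly the sketch below, reconstructed from the task name and the ansible.modules.ping payload transferred later in the log.

# Approximate shape of the connectivity re-test task.
- name: Re-test connectivity
  ping: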
46400 1727204619.99365: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204619.99497: in run() - task 0affcd87-79f5-1303-fda8-0000000021b7 46400 1727204619.99513: variable 'ansible_search_path' from source: unknown 46400 1727204619.99518: variable 'ansible_search_path' from source: unknown 46400 1727204619.99551: calling self._execute() 46400 1727204619.99653: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204619.99661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204619.99666: variable 'omit' from source: magic vars 46400 1727204620.00741: variable 'ansible_distribution_major_version' from source: facts 46400 1727204620.00754: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204620.00887: variable 'network_state' from source: role '' defaults 46400 1727204620.00899: Evaluated conditional (network_state != {}): False 46400 1727204620.00903: when evaluation is False, skipping this task 46400 1727204620.00906: _execute() done 46400 1727204620.00908: dumping result to json 46400 1727204620.00911: done dumping result, returning 46400 1727204620.00913: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-0000000021b7] 46400 1727204620.00923: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b7 46400 1727204620.01027: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b7 46400 1727204620.01030: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204620.01082: no more pending results, returning what we have 46400 1727204620.01086: results queue empty 46400 1727204620.01087: checking for any_errors_fatal 46400 1727204620.01097: done checking for any_errors_fatal 46400 1727204620.01098: checking for max_fail_percentage 46400 1727204620.01100: done checking for max_fail_percentage 46400 1727204620.01101: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.01101: done checking to see if all hosts have failed 46400 1727204620.01102: getting the remaining hosts for this loop 46400 1727204620.01104: done getting the remaining hosts for this loop 46400 1727204620.01108: getting the next task for host managed-node2 46400 1727204620.01117: done getting next task for host managed-node2 46400 1727204620.01121: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204620.01127: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204620.01152: getting variables 46400 1727204620.01154: in VariableManager get_vars() 46400 1727204620.01207: Calling all_inventory to load vars for managed-node2 46400 1727204620.01211: Calling groups_inventory to load vars for managed-node2 46400 1727204620.01213: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.01227: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.01230: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.01233: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.03125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.05611: done with get_vars() 46400 1727204620.05645: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.082) 0:01:50.342 ***** 46400 1727204620.05763: entering _queue_task() for managed-node2/ping 46400 1727204620.06137: worker is 1 (out of 1 available) 46400 1727204620.06150: exiting _queue_task() for managed-node2/ping 46400 1727204620.06166: done queuing things up, now waiting for results queue to drain 46400 1727204620.06168: waiting for pending results... 46400 1727204620.06716: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204620.06890: in run() - task 0affcd87-79f5-1303-fda8-0000000021b8 46400 1727204620.06913: variable 'ansible_search_path' from source: unknown 46400 1727204620.06920: variable 'ansible_search_path' from source: unknown 46400 1727204620.06974: calling self._execute() 46400 1727204620.07093: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.07104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.07117: variable 'omit' from source: magic vars 46400 1727204620.07623: variable 'ansible_distribution_major_version' from source: facts 46400 1727204620.07642: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204620.07652: variable 'omit' from source: magic vars 46400 1727204620.07730: variable 'omit' from source: magic vars 46400 1727204620.07773: variable 'omit' from source: magic vars 46400 1727204620.07818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204620.07871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204620.07896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204620.07918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204620.07935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204620.07982: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204620.07991: variable 'ansible_host' from source: host vars for 
'managed-node2' 46400 1727204620.07998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.08104: Set connection var ansible_shell_type to sh 46400 1727204620.08119: Set connection var ansible_shell_executable to /bin/sh 46400 1727204620.08128: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204620.08137: Set connection var ansible_connection to ssh 46400 1727204620.08145: Set connection var ansible_pipelining to False 46400 1727204620.08161: Set connection var ansible_timeout to 10 46400 1727204620.08196: variable 'ansible_shell_executable' from source: unknown 46400 1727204620.08203: variable 'ansible_connection' from source: unknown 46400 1727204620.08210: variable 'ansible_module_compression' from source: unknown 46400 1727204620.08216: variable 'ansible_shell_type' from source: unknown 46400 1727204620.08222: variable 'ansible_shell_executable' from source: unknown 46400 1727204620.08227: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.08234: variable 'ansible_pipelining' from source: unknown 46400 1727204620.08239: variable 'ansible_timeout' from source: unknown 46400 1727204620.08245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.08466: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204620.08614: variable 'omit' from source: magic vars 46400 1727204620.08624: starting attempt loop 46400 1727204620.08630: running the handler 46400 1727204620.08646: _low_level_execute_command(): starting 46400 1727204620.08656: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204620.09750: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204620.09772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.09790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.09817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.09868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.09883: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204620.09898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.09926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204620.09940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204620.09952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204620.09970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.09986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.10002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.10017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.10031: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204620.10051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.10143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204620.10174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.10192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.10281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.11943: stdout chunk (state=3): >>>/root <<< 46400 1727204620.12143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204620.12147: stdout chunk (state=3): >>><<< 46400 1727204620.12149: stderr chunk (state=3): >>><<< 46400 1727204620.12206: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204620.12307: _low_level_execute_command(): starting 46400 1727204620.12311: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017 `" && echo ansible-tmp-1727204620.1217685-53873-166866111896017="` echo /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017 `" ) && sleep 0' 46400 1727204620.13317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.13321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.13352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.13355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.13357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
found <<< 46400 1727204620.13362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.13503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.13703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.13707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.15572: stdout chunk (state=3): >>>ansible-tmp-1727204620.1217685-53873-166866111896017=/root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017 <<< 46400 1727204620.15751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204620.15755: stdout chunk (state=3): >>><<< 46400 1727204620.15766: stderr chunk (state=3): >>><<< 46400 1727204620.15788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.1217685-53873-166866111896017=/root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204620.15837: variable 'ansible_module_compression' from source: unknown 46400 1727204620.15881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204620.15917: variable 'ansible_facts' from source: unknown 46400 1727204620.15971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/AnsiballZ_ping.py 46400 1727204620.16288: Sending initial data 46400 1727204620.16292: Sent initial data (153 bytes) 46400 1727204620.18114: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204620.18123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.18134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.18148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.18190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.18195: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204620.18207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 46400 1727204620.18221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204620.18229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204620.18236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204620.18244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.18254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.18267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.18276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.18282: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204620.18292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.18375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204620.18382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.18385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.18617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.20340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204620.20379: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204620.20419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp1ewsmbx1 /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/AnsiballZ_ping.py <<< 46400 1727204620.20454: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204620.21839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204620.21979: stderr chunk (state=3): >>><<< 46400 1727204620.21982: stdout chunk (state=3): >>><<< 46400 1727204620.22004: done transferring module to remote 46400 1727204620.22015: _low_level_execute_command(): starting 46400 1727204620.22020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/ /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/AnsiballZ_ping.py && sleep 0' 46400 1727204620.23698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.23702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.23778: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.23854: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204620.23857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.23881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204620.23886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.23892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.23897: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204620.23909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.24100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204620.24114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.24120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.24197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.26043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204620.26047: stderr chunk (state=3): >>><<< 46400 1727204620.26050: stdout chunk (state=3): >>><<< 46400 1727204620.26076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204620.26080: _low_level_execute_command(): starting 46400 1727204620.26083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/AnsiballZ_ping.py && sleep 0' 46400 1727204620.27968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204620.27988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.28004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.28025: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.28183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.28194: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204620.28207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.28225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204620.28237: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204620.28249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204620.28268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.28284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.28299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.28310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.28321: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204620.28334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.28421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204620.28494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.28512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.28655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.41511: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204620.42749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204620.42753: stdout chunk (state=3): >>><<< 46400 1727204620.42756: stderr chunk (state=3): >>><<< 46400 1727204620.42902: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
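The exchange above is a standard ansible.builtin.ping round-trip: the AnsiballZ_ping.py payload is copied into the remote temp directory, executed with the remote Python, and returns {"ping": "pong"}. A minimal task equivalent to the "Re-test connectivity" step seen in this log, sketched here for illustration only (not taken from the role source), would be:

    - name: Re-test connectivity
      ansible.builtin.ping:

With no arguments the module defaults to data: pong, which matches the recorded invocation ("module_args": {"data": "pong"}).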
46400 1727204620.42907: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204620.42910: _low_level_execute_command(): starting 46400 1727204620.42913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.1217685-53873-166866111896017/ > /dev/null 2>&1 && sleep 0' 46400 1727204620.44247: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204620.44257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.44268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.44284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.44321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.44328: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204620.44338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.44351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204620.44479: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204620.44486: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204620.44494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204620.44503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204620.44515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204620.44522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204620.44529: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204620.44539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204620.45072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204620.45075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204620.45078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204620.45080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204620.46582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204620.46601: stderr chunk (state=3): >>><<< 46400 1727204620.46604: stdout chunk (state=3): >>><<< 46400 1727204620.46624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204620.46631: handler run complete 46400 1727204620.46649: attempt loop complete, returning result 46400 1727204620.46652: _execute() done 46400 1727204620.46655: dumping result to json 46400 1727204620.46657: done dumping result, returning 46400 1727204620.46671: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-0000000021b8] 46400 1727204620.46676: sending task result for task 0affcd87-79f5-1303-fda8-0000000021b8 46400 1727204620.46778: done sending task result for task 0affcd87-79f5-1303-fda8-0000000021b8 46400 1727204620.46781: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204620.46841: no more pending results, returning what we have 46400 1727204620.46845: results queue empty 46400 1727204620.46846: checking for any_errors_fatal 46400 1727204620.46854: done checking for any_errors_fatal 46400 1727204620.46854: checking for max_fail_percentage 46400 1727204620.46856: done checking for max_fail_percentage 46400 1727204620.46857: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.46858: done checking to see if all hosts have failed 46400 1727204620.46861: getting the remaining hosts for this loop 46400 1727204620.46863: done getting the remaining hosts for this loop 46400 1727204620.46868: getting the next task for host managed-node2 46400 1727204620.46882: done getting next task for host managed-node2 46400 1727204620.46885: ^ task is: TASK: meta (role_complete) 46400 1727204620.46898: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204620.46911: getting variables 46400 1727204620.46913: in VariableManager get_vars() 46400 1727204620.46958: Calling all_inventory to load vars for managed-node2 46400 1727204620.46962: Calling groups_inventory to load vars for managed-node2 46400 1727204620.46967: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.46977: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.46979: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.46982: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.50808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.53028: done with get_vars() 46400 1727204620.53066: done getting variables 46400 1727204620.53156: done queuing things up, now waiting for results queue to drain 46400 1727204620.53158: results queue empty 46400 1727204620.53162: checking for any_errors_fatal 46400 1727204620.53168: done checking for any_errors_fatal 46400 1727204620.53191: checking for max_fail_percentage 46400 1727204620.53193: done checking for max_fail_percentage 46400 1727204620.53194: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.53195: done checking to see if all hosts have failed 46400 1727204620.53195: getting the remaining hosts for this loop 46400 1727204620.53197: done getting the remaining hosts for this loop 46400 1727204620.53204: getting the next task for host managed-node2 46400 1727204620.53236: done getting next task for host managed-node2 46400 1727204620.53246: ^ task is: TASK: Show result 46400 1727204620.53249: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204620.53257: getting variables 46400 1727204620.53259: in VariableManager get_vars() 46400 1727204620.53283: Calling all_inventory to load vars for managed-node2 46400 1727204620.53285: Calling groups_inventory to load vars for managed-node2 46400 1727204620.53288: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.53294: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.53301: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.53305: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.54621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.56605: done with get_vars() 46400 1727204620.56628: done getting variables 46400 1727204620.56687: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.509) 0:01:50.851 ***** 46400 1727204620.56721: entering _queue_task() for managed-node2/debug 46400 1727204620.57784: worker is 1 (out of 1 available) 46400 1727204620.57910: exiting _queue_task() for managed-node2/debug 46400 1727204620.57924: done queuing things up, now waiting for results queue to drain 46400 1727204620.57926: waiting for pending results... 46400 1727204620.59252: running TaskExecutor() for managed-node2/TASK: Show result 46400 1727204620.59369: in run() - task 0affcd87-79f5-1303-fda8-00000000213a 46400 1727204620.59381: variable 'ansible_search_path' from source: unknown 46400 1727204620.59384: variable 'ansible_search_path' from source: unknown 46400 1727204620.59421: calling self._execute() 46400 1727204620.59521: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.59527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.59538: variable 'omit' from source: magic vars 46400 1727204620.59924: variable 'ansible_distribution_major_version' from source: facts 46400 1727204620.59935: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204620.59941: variable 'omit' from source: magic vars 46400 1727204620.59999: variable 'omit' from source: magic vars 46400 1727204620.60036: variable 'omit' from source: magic vars 46400 1727204620.60083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204620.60121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204620.60144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204620.60163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204620.60174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204620.60203: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 
1727204620.60212: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.60215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.60309: Set connection var ansible_shell_type to sh 46400 1727204620.60318: Set connection var ansible_shell_executable to /bin/sh 46400 1727204620.60323: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204620.60329: Set connection var ansible_connection to ssh 46400 1727204620.60334: Set connection var ansible_pipelining to False 46400 1727204620.60340: Set connection var ansible_timeout to 10 46400 1727204620.60366: variable 'ansible_shell_executable' from source: unknown 46400 1727204620.60370: variable 'ansible_connection' from source: unknown 46400 1727204620.60373: variable 'ansible_module_compression' from source: unknown 46400 1727204620.60375: variable 'ansible_shell_type' from source: unknown 46400 1727204620.60377: variable 'ansible_shell_executable' from source: unknown 46400 1727204620.60380: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.60382: variable 'ansible_pipelining' from source: unknown 46400 1727204620.60386: variable 'ansible_timeout' from source: unknown 46400 1727204620.60390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.60531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204620.60541: variable 'omit' from source: magic vars 46400 1727204620.60547: starting attempt loop 46400 1727204620.60550: running the handler 46400 1727204620.60597: variable '__network_connections_result' from source: set_fact 46400 1727204620.60681: variable '__network_connections_result' from source: set_fact 46400 1727204620.60795: handler run complete 46400 1727204620.60821: attempt loop complete, returning result 46400 1727204620.60825: _execute() done 46400 1727204620.60828: dumping result to json 46400 1727204620.60830: done dumping result, returning 46400 1727204620.60840: done running TaskExecutor() for managed-node2/TASK: Show result [0affcd87-79f5-1303-fda8-00000000213a] 46400 1727204620.60847: sending task result for task 0affcd87-79f5-1303-fda8-00000000213a 46400 1727204620.60943: done sending task result for task 0affcd87-79f5-1303-fda8-00000000213a 46400 1727204620.60945: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807" ] } } 46400 1727204620.61035: no more pending results, returning what we have 46400 1727204620.61039: results queue empty 46400 1727204620.61040: checking for any_errors_fatal 46400 1727204620.61042: done checking for any_errors_fatal 46400 
1727204620.61043: checking for max_fail_percentage 46400 1727204620.61044: done checking for max_fail_percentage 46400 1727204620.61045: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.61046: done checking to see if all hosts have failed 46400 1727204620.61047: getting the remaining hosts for this loop 46400 1727204620.61048: done getting the remaining hosts for this loop 46400 1727204620.61052: getting the next task for host managed-node2 46400 1727204620.61067: done getting next task for host managed-node2 46400 1727204620.61072: ^ task is: TASK: Include network role 46400 1727204620.61076: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204620.61079: getting variables 46400 1727204620.61081: in VariableManager get_vars() 46400 1727204620.61119: Calling all_inventory to load vars for managed-node2 46400 1727204620.61121: Calling groups_inventory to load vars for managed-node2 46400 1727204620.61125: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.61135: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.61138: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.61141: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.62602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.64447: done with get_vars() 46400 1727204620.64484: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.078) 0:01:50.930 ***** 46400 1727204620.64621: entering _queue_task() for managed-node2/include_role 46400 1727204620.64996: worker is 1 (out of 1 available) 46400 1727204620.65009: exiting _queue_task() for managed-node2/include_role 46400 1727204620.65022: done queuing things up, now waiting for results queue to drain 46400 1727204620.65024: waiting for pending results... 
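The "Include network role" task queued above (activate_profile.yml:3) is dispatched through the include_role action and, as the following lines show, pulls in fedora.linux_system_roles.network for managed-node2. A minimal sketch of such a task, for illustration only (not the playbook's literal source), would be:

    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network

The role's defaults/main.yml, meta/main.yml and tasks/main.yml are then loaded from the collection path, as logged below.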
46400 1727204620.65366: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204620.65628: in run() - task 0affcd87-79f5-1303-fda8-00000000213e 46400 1727204620.65650: variable 'ansible_search_path' from source: unknown 46400 1727204620.65658: variable 'ansible_search_path' from source: unknown 46400 1727204620.65707: calling self._execute() 46400 1727204620.65822: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.65835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.65849: variable 'omit' from source: magic vars 46400 1727204620.66363: variable 'ansible_distribution_major_version' from source: facts 46400 1727204620.66383: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204620.66393: _execute() done 46400 1727204620.66400: dumping result to json 46400 1727204620.66407: done dumping result, returning 46400 1727204620.66416: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-00000000213e] 46400 1727204620.66428: sending task result for task 0affcd87-79f5-1303-fda8-00000000213e 46400 1727204620.66620: no more pending results, returning what we have 46400 1727204620.66626: in VariableManager get_vars() 46400 1727204620.66685: Calling all_inventory to load vars for managed-node2 46400 1727204620.66689: Calling groups_inventory to load vars for managed-node2 46400 1727204620.66694: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.66710: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.66713: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.66717: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.67826: done sending task result for task 0affcd87-79f5-1303-fda8-00000000213e 46400 1727204620.67830: WORKER PROCESS EXITING 46400 1727204620.68915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.71129: done with get_vars() 46400 1727204620.71167: variable 'ansible_search_path' from source: unknown 46400 1727204620.71168: variable 'ansible_search_path' from source: unknown 46400 1727204620.71450: variable 'omit' from source: magic vars 46400 1727204620.71584: variable 'omit' from source: magic vars 46400 1727204620.71600: variable 'omit' from source: magic vars 46400 1727204620.71604: we have included files to process 46400 1727204620.71605: generating all_blocks data 46400 1727204620.71607: done generating all_blocks data 46400 1727204620.71613: processing included file: fedora.linux_system_roles.network 46400 1727204620.71634: in VariableManager get_vars() 46400 1727204620.71652: done with get_vars() 46400 1727204620.71689: in VariableManager get_vars() 46400 1727204620.71709: done with get_vars() 46400 1727204620.71745: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204620.71888: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204620.71981: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204620.72527: in VariableManager get_vars() 46400 1727204620.72668: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204620.75450: iterating over new_blocks loaded from 
include file 46400 1727204620.75453: in VariableManager get_vars() 46400 1727204620.75479: done with get_vars() 46400 1727204620.75481: filtering new block on tags 46400 1727204620.75880: done filtering new block on tags 46400 1727204620.75884: in VariableManager get_vars() 46400 1727204620.75902: done with get_vars() 46400 1727204620.75904: filtering new block on tags 46400 1727204620.76005: done filtering new block on tags 46400 1727204620.76008: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204620.76014: extending task lists for all hosts with included blocks 46400 1727204620.76141: done extending task lists 46400 1727204620.76143: done processing included files 46400 1727204620.76143: results queue empty 46400 1727204620.76144: checking for any_errors_fatal 46400 1727204620.76149: done checking for any_errors_fatal 46400 1727204620.76149: checking for max_fail_percentage 46400 1727204620.76151: done checking for max_fail_percentage 46400 1727204620.76151: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.76152: done checking to see if all hosts have failed 46400 1727204620.76153: getting the remaining hosts for this loop 46400 1727204620.76154: done getting the remaining hosts for this loop 46400 1727204620.76157: getting the next task for host managed-node2 46400 1727204620.76167: done getting next task for host managed-node2 46400 1727204620.76170: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204620.76174: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204620.76185: getting variables 46400 1727204620.76186: in VariableManager get_vars() 46400 1727204620.76200: Calling all_inventory to load vars for managed-node2 46400 1727204620.76202: Calling groups_inventory to load vars for managed-node2 46400 1727204620.76204: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.76436: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.76441: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.76445: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.78241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.80582: done with get_vars() 46400 1727204620.80604: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.161) 0:01:51.092 ***** 46400 1727204620.80809: entering _queue_task() for managed-node2/include_tasks 46400 1727204620.81492: worker is 1 (out of 1 available) 46400 1727204620.81617: exiting _queue_task() for managed-node2/include_tasks 46400 1727204620.81631: done queuing things up, now waiting for results queue to drain 46400 1727204620.81633: waiting for pending results... 46400 1727204620.82069: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204620.82197: in run() - task 0affcd87-79f5-1303-fda8-000000002328 46400 1727204620.82212: variable 'ansible_search_path' from source: unknown 46400 1727204620.82216: variable 'ansible_search_path' from source: unknown 46400 1727204620.82251: calling self._execute() 46400 1727204620.82914: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204620.82921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204620.82930: variable 'omit' from source: magic vars 46400 1727204620.83658: variable 'ansible_distribution_major_version' from source: facts 46400 1727204620.83673: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204620.83679: _execute() done 46400 1727204620.83683: dumping result to json 46400 1727204620.83685: done dumping result, returning 46400 1727204620.83692: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000002328] 46400 1727204620.83699: sending task result for task 0affcd87-79f5-1303-fda8-000000002328 46400 1727204620.83815: done sending task result for task 0affcd87-79f5-1303-fda8-000000002328 46400 1727204620.83819: WORKER PROCESS EXITING 46400 1727204620.83871: no more pending results, returning what we have 46400 1727204620.83876: in VariableManager get_vars() 46400 1727204620.83929: Calling all_inventory to load vars for managed-node2 46400 1727204620.83933: Calling groups_inventory to load vars for managed-node2 46400 1727204620.83935: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.83947: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.83950: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.83952: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.86521: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204620.90350: done with get_vars() 46400 1727204620.90390: variable 'ansible_search_path' from source: unknown 46400 1727204620.90507: variable 'ansible_search_path' from source: unknown 46400 1727204620.90551: we have included files to process 46400 1727204620.90553: generating all_blocks data 46400 1727204620.90555: done generating all_blocks data 46400 1727204620.90562: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204620.90565: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204620.90568: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204620.91557: done processing included file 46400 1727204620.91560: iterating over new_blocks loaded from include file 46400 1727204620.91562: in VariableManager get_vars() 46400 1727204620.91593: done with get_vars() 46400 1727204620.91595: filtering new block on tags 46400 1727204620.91628: done filtering new block on tags 46400 1727204620.91631: in VariableManager get_vars() 46400 1727204620.91659: done with get_vars() 46400 1727204620.91661: filtering new block on tags 46400 1727204620.91758: done filtering new block on tags 46400 1727204620.91760: in VariableManager get_vars() 46400 1727204620.91786: done with get_vars() 46400 1727204620.91788: filtering new block on tags 46400 1727204620.91832: done filtering new block on tags 46400 1727204620.91835: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204620.91840: extending task lists for all hosts with included blocks 46400 1727204620.94053: done extending task lists 46400 1727204620.94054: done processing included files 46400 1727204620.94055: results queue empty 46400 1727204620.94056: checking for any_errors_fatal 46400 1727204620.94060: done checking for any_errors_fatal 46400 1727204620.94061: checking for max_fail_percentage 46400 1727204620.94062: done checking for max_fail_percentage 46400 1727204620.94063: checking to see if all hosts have failed and the running result is not ok 46400 1727204620.94189: done checking to see if all hosts have failed 46400 1727204620.94191: getting the remaining hosts for this loop 46400 1727204620.94192: done getting the remaining hosts for this loop 46400 1727204620.94195: getting the next task for host managed-node2 46400 1727204620.94201: done getting next task for host managed-node2 46400 1727204620.94204: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204620.94208: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204620.94220: getting variables 46400 1727204620.94221: in VariableManager get_vars() 46400 1727204620.94238: Calling all_inventory to load vars for managed-node2 46400 1727204620.94241: Calling groups_inventory to load vars for managed-node2 46400 1727204620.94243: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204620.94249: Calling all_plugins_play to load vars for managed-node2 46400 1727204620.94251: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204620.94255: Calling groups_plugins_play to load vars for managed-node2 46400 1727204620.96989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204621.00184: done with get_vars() 46400 1727204621.00220: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.205) 0:01:51.297 ***** 46400 1727204621.01334: entering _queue_task() for managed-node2/setup 46400 1727204621.01679: worker is 1 (out of 1 available) 46400 1727204621.01694: exiting _queue_task() for managed-node2/setup 46400 1727204621.01709: done queuing things up, now waiting for results queue to drain 46400 1727204621.01710: waiting for pending results... 
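The "Ensure ansible_facts used by role are present" task entered above (set_facts.yml:3) is a setup task guarded by the conditional evaluated a few lines below: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0. A rough sketch of that pattern, for illustration only (the module arguments are hypothetical and not shown in this log), would be:

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # hypothetical argument, not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because all facts required by the role are already present for managed-node2, the conditional evaluates to False here and the task is skipped.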
46400 1727204621.02931: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204621.03350: in run() - task 0affcd87-79f5-1303-fda8-00000000237f 46400 1727204621.03375: variable 'ansible_search_path' from source: unknown 46400 1727204621.03383: variable 'ansible_search_path' from source: unknown 46400 1727204621.03424: calling self._execute() 46400 1727204621.03647: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.03659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.03795: variable 'omit' from source: magic vars 46400 1727204621.04610: variable 'ansible_distribution_major_version' from source: facts 46400 1727204621.04779: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204621.05245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204621.09442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204621.09517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204621.09567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204621.09607: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204621.09642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204621.09771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204621.09806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204621.09840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204621.09888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204621.09909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204621.09969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204621.09999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204621.10030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204621.10079: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204621.10100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204621.10266: variable '__network_required_facts' from source: role '' defaults 46400 1727204621.10347: variable 'ansible_facts' from source: unknown 46400 1727204621.11160: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204621.11173: when evaluation is False, skipping this task 46400 1727204621.11181: _execute() done 46400 1727204621.11188: dumping result to json 46400 1727204621.11197: done dumping result, returning 46400 1727204621.11209: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-00000000237f] 46400 1727204621.11219: sending task result for task 0affcd87-79f5-1303-fda8-00000000237f skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204621.11371: no more pending results, returning what we have 46400 1727204621.11376: results queue empty 46400 1727204621.11377: checking for any_errors_fatal 46400 1727204621.11379: done checking for any_errors_fatal 46400 1727204621.11379: checking for max_fail_percentage 46400 1727204621.11381: done checking for max_fail_percentage 46400 1727204621.11382: checking to see if all hosts have failed and the running result is not ok 46400 1727204621.11383: done checking to see if all hosts have failed 46400 1727204621.11383: getting the remaining hosts for this loop 46400 1727204621.11385: done getting the remaining hosts for this loop 46400 1727204621.11389: getting the next task for host managed-node2 46400 1727204621.11399: done getting next task for host managed-node2 46400 1727204621.11403: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204621.11409: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204621.11438: getting variables 46400 1727204621.11440: in VariableManager get_vars() 46400 1727204621.11498: Calling all_inventory to load vars for managed-node2 46400 1727204621.11501: Calling groups_inventory to load vars for managed-node2 46400 1727204621.11503: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204621.11516: Calling all_plugins_play to load vars for managed-node2 46400 1727204621.11519: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204621.11523: Calling groups_plugins_play to load vars for managed-node2 46400 1727204621.12045: done sending task result for task 0affcd87-79f5-1303-fda8-00000000237f 46400 1727204621.12055: WORKER PROCESS EXITING 46400 1727204621.16141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204621.22858: done with get_vars() 46400 1727204621.22899: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.220) 0:01:51.518 ***** 46400 1727204621.23401: entering _queue_task() for managed-node2/stat 46400 1727204621.24440: worker is 1 (out of 1 available) 46400 1727204621.24455: exiting _queue_task() for managed-node2/stat 46400 1727204621.24698: done queuing things up, now waiting for results queue to drain 46400 1727204621.24701: waiting for pending results... 46400 1727204621.25285: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204621.25532: in run() - task 0affcd87-79f5-1303-fda8-000000002381 46400 1727204621.25797: variable 'ansible_search_path' from source: unknown 46400 1727204621.25807: variable 'ansible_search_path' from source: unknown 46400 1727204621.25859: calling self._execute() 46400 1727204621.26189: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.26201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.26513: variable 'omit' from source: magic vars 46400 1727204621.26890: variable 'ansible_distribution_major_version' from source: facts 46400 1727204621.27988: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204621.28174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204621.28845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204621.29514: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204621.29551: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204621.29591: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204621.29684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204621.29714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204621.29745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204621.29779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204621.29875: variable '__network_is_ostree' from source: set_fact 46400 1727204621.29887: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204621.29895: when evaluation is False, skipping this task 46400 1727204621.29900: _execute() done 46400 1727204621.29906: dumping result to json 46400 1727204621.29916: done dumping result, returning 46400 1727204621.29928: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-000000002381] 46400 1727204621.29938: sending task result for task 0affcd87-79f5-1303-fda8-000000002381 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204621.30102: no more pending results, returning what we have 46400 1727204621.30106: results queue empty 46400 1727204621.30107: checking for any_errors_fatal 46400 1727204621.30117: done checking for any_errors_fatal 46400 1727204621.30118: checking for max_fail_percentage 46400 1727204621.30119: done checking for max_fail_percentage 46400 1727204621.30120: checking to see if all hosts have failed and the running result is not ok 46400 1727204621.30121: done checking to see if all hosts have failed 46400 1727204621.30122: getting the remaining hosts for this loop 46400 1727204621.30123: done getting the remaining hosts for this loop 46400 1727204621.30127: getting the next task for host managed-node2 46400 1727204621.30137: done getting next task for host managed-node2 46400 1727204621.30142: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204621.30148: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204621.30189: getting variables 46400 1727204621.30191: in VariableManager get_vars() 46400 1727204621.30242: Calling all_inventory to load vars for managed-node2 46400 1727204621.30245: Calling groups_inventory to load vars for managed-node2 46400 1727204621.30247: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204621.30262: Calling all_plugins_play to load vars for managed-node2 46400 1727204621.30268: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204621.30272: Calling groups_plugins_play to load vars for managed-node2 46400 1727204621.31821: done sending task result for task 0affcd87-79f5-1303-fda8-000000002381 46400 1727204621.31824: WORKER PROCESS EXITING 46400 1727204621.33477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204621.39062: done with get_vars() 46400 1727204621.39110: done getting variables 46400 1727204621.39692: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.163) 0:01:51.681 ***** 46400 1727204621.39734: entering _queue_task() for managed-node2/set_fact 46400 1727204621.40114: worker is 1 (out of 1 available) 46400 1727204621.40128: exiting _queue_task() for managed-node2/set_fact 46400 1727204621.40140: done queuing things up, now waiting for results queue to drain 46400 1727204621.40142: waiting for pending results... 
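Editor's note: both ostree-related tasks in this stretch of the log are skipped because __network_is_ostree was already set by an earlier pass through the role in the same run, so the guard "not __network_is_ostree is defined" evaluates to False. A minimal sketch of that guard pattern is given below for orientation; it assumes the conventional /run/ostree-booted marker file, and only the conditions visible in the "Evaluated conditional" lines above are taken from this log. The actual task bodies in roles/network/tasks/set_facts.yml are not reproduced here and may differ.

# Hypothetical reconstruction for illustration only (not the role's verbatim tasks).
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted          # assumed marker path, not shown in this log
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined   # condition taken from the log above

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined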
46400 1727204621.41358: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204621.41840: in run() - task 0affcd87-79f5-1303-fda8-000000002382 46400 1727204621.41850: variable 'ansible_search_path' from source: unknown 46400 1727204621.41854: variable 'ansible_search_path' from source: unknown 46400 1727204621.41993: calling self._execute() 46400 1727204621.42123: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.42127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.42139: variable 'omit' from source: magic vars 46400 1727204621.43027: variable 'ansible_distribution_major_version' from source: facts 46400 1727204621.43040: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204621.43538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204621.44115: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204621.44272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204621.44310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204621.44342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204621.44578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204621.44681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204621.44722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204621.44750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204621.45076: variable '__network_is_ostree' from source: set_fact 46400 1727204621.45084: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204621.45088: when evaluation is False, skipping this task 46400 1727204621.45090: _execute() done 46400 1727204621.45093: dumping result to json 46400 1727204621.45095: done dumping result, returning 46400 1727204621.45105: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-000000002382] 46400 1727204621.45111: sending task result for task 0affcd87-79f5-1303-fda8-000000002382 46400 1727204621.45330: done sending task result for task 0affcd87-79f5-1303-fda8-000000002382 46400 1727204621.45333: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204621.45384: no more pending results, returning what we have 46400 1727204621.45389: results queue empty 46400 1727204621.45391: checking for any_errors_fatal 46400 1727204621.45400: done checking for any_errors_fatal 46400 
1727204621.45401: checking for max_fail_percentage 46400 1727204621.45402: done checking for max_fail_percentage 46400 1727204621.45404: checking to see if all hosts have failed and the running result is not ok 46400 1727204621.45405: done checking to see if all hosts have failed 46400 1727204621.45405: getting the remaining hosts for this loop 46400 1727204621.45407: done getting the remaining hosts for this loop 46400 1727204621.45411: getting the next task for host managed-node2 46400 1727204621.45425: done getting next task for host managed-node2 46400 1727204621.45429: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204621.45436: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204621.45469: getting variables 46400 1727204621.45471: in VariableManager get_vars() 46400 1727204621.45520: Calling all_inventory to load vars for managed-node2 46400 1727204621.45523: Calling groups_inventory to load vars for managed-node2 46400 1727204621.45525: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204621.45536: Calling all_plugins_play to load vars for managed-node2 46400 1727204621.45538: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204621.45541: Calling groups_plugins_play to load vars for managed-node2 46400 1727204621.48288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204621.54003: done with get_vars() 46400 1727204621.54070: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.145) 0:01:51.827 ***** 46400 1727204621.54302: entering _queue_task() for managed-node2/service_facts 46400 1727204621.55569: worker is 1 (out of 1 available) 46400 1727204621.55583: exiting _queue_task() for managed-node2/service_facts 46400 1727204621.55603: done queuing things up, now waiting for results queue to drain 46400 1727204621.55605: waiting for pending results... 
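Editor's note: the "Check which services are running" task traced below exercises the full remote execution path: the controller reuses the multiplexed SSH connection, creates a temp directory under ~/.ansible/tmp, pushes the AnsiballZ-wrapped service_facts module over sftp, marks it executable, runs it with the remote /usr/bin/python3.9, and parses the JSON it prints. A minimal sketch of how such a task is typically written and consumed follows; the consumer task and its variable name are illustrative inventions, not taken from this log.

# Sketch only. service_facts takes no parameters (the log shows "module_args": {}).
- name: Check which services are running
  ansible.builtin.service_facts:

# Hypothetical consumer: gathered state lands under ansible_facts.services,
# keyed by unit name, exactly as in the JSON dump that follows in the log.
- name: Record whether NetworkManager is running (example only)
  ansible.builtin.set_fact:
    __example_nm_running: "{{ ansible_facts.services['NetworkManager.service'].state | default('unknown') == 'running' }}"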
46400 1727204621.56663: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204621.57192: in run() - task 0affcd87-79f5-1303-fda8-000000002384 46400 1727204621.57207: variable 'ansible_search_path' from source: unknown 46400 1727204621.57212: variable 'ansible_search_path' from source: unknown 46400 1727204621.57380: calling self._execute() 46400 1727204621.57600: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.57605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.57615: variable 'omit' from source: magic vars 46400 1727204621.58526: variable 'ansible_distribution_major_version' from source: facts 46400 1727204621.58538: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204621.58544: variable 'omit' from source: magic vars 46400 1727204621.58757: variable 'omit' from source: magic vars 46400 1727204621.58920: variable 'omit' from source: magic vars 46400 1727204621.58962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204621.59170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204621.59174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204621.59177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204621.59193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204621.59459: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204621.59467: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.59470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.59695: Set connection var ansible_shell_type to sh 46400 1727204621.59705: Set connection var ansible_shell_executable to /bin/sh 46400 1727204621.59710: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204621.59716: Set connection var ansible_connection to ssh 46400 1727204621.59722: Set connection var ansible_pipelining to False 46400 1727204621.59727: Set connection var ansible_timeout to 10 46400 1727204621.60007: variable 'ansible_shell_executable' from source: unknown 46400 1727204621.60017: variable 'ansible_connection' from source: unknown 46400 1727204621.60021: variable 'ansible_module_compression' from source: unknown 46400 1727204621.60023: variable 'ansible_shell_type' from source: unknown 46400 1727204621.60025: variable 'ansible_shell_executable' from source: unknown 46400 1727204621.60028: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204621.60034: variable 'ansible_pipelining' from source: unknown 46400 1727204621.60036: variable 'ansible_timeout' from source: unknown 46400 1727204621.60041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204621.60495: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204621.60505: variable 'omit' from source: magic vars 46400 
1727204621.60510: starting attempt loop 46400 1727204621.60513: running the handler 46400 1727204621.60527: _low_level_execute_command(): starting 46400 1727204621.60534: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204621.62869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204621.62882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.62893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.62910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.62983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.63035: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204621.63045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.63069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204621.63079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204621.63086: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204621.63094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.63104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.63117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.63144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.63151: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204621.63172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.63245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204621.63431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204621.63442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204621.63594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204621.65303: stdout chunk (state=3): >>>/root <<< 46400 1727204621.65388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204621.65393: stdout chunk (state=3): >>><<< 46400 1727204621.65402: stderr chunk (state=3): >>><<< 46400 1727204621.65429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204621.65444: _low_level_execute_command(): starting 46400 1727204621.65450: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527 `" && echo ansible-tmp-1727204621.6542907-54211-57062586853527="` echo /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527 `" ) && sleep 0' 46400 1727204621.67074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.67080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.67210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204621.67214: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204621.67383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.67458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204621.67483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204621.67500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204621.67577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204621.69434: stdout chunk (state=3): >>>ansible-tmp-1727204621.6542907-54211-57062586853527=/root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527 <<< 46400 1727204621.69633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204621.69637: stdout chunk (state=3): >>><<< 46400 1727204621.69641: stderr chunk (state=3): >>><<< 46400 1727204621.69671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204621.6542907-54211-57062586853527=/root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204621.69870: variable 'ansible_module_compression' from source: unknown 46400 1727204621.69873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204621.69876: variable 'ansible_facts' from source: unknown 46400 1727204621.69878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/AnsiballZ_service_facts.py 46400 1727204621.70492: Sending initial data 46400 1727204621.70496: Sent initial data (161 bytes) 46400 1727204621.72939: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204621.72959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.72977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.72996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.73039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.73053: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204621.73071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.73185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204621.73199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204621.73211: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204621.73224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.73240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.73257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.73273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.73286: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204621.73301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.73380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204621.73404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204621.73587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204621.73658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204621.75370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204621.75400: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204621.75440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmplobk0hx1 /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/AnsiballZ_service_facts.py <<< 46400 1727204621.75480: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204621.76967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204621.77057: stderr chunk (state=3): >>><<< 46400 1727204621.77063: stdout chunk (state=3): >>><<< 46400 1727204621.77084: done transferring module to remote 46400 1727204621.77095: _low_level_execute_command(): starting 46400 1727204621.77100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/ /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/AnsiballZ_service_facts.py && sleep 0' 46400 1727204621.78753: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204621.78766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.78777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.78791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.78842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.78850: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204621.78863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.78930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204621.78940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204621.78948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204621.78956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.78967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.78980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.78988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.78995: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204621.79005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.79184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204621.79198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204621.79208: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 46400 1727204621.79366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204621.81084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204621.81117: stderr chunk (state=3): >>><<< 46400 1727204621.81120: stdout chunk (state=3): >>><<< 46400 1727204621.81136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204621.81139: _low_level_execute_command(): starting 46400 1727204621.81145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/AnsiballZ_service_facts.py && sleep 0' 46400 1727204621.81834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204621.81842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.81852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.81868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.81915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.81922: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204621.81932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.81945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204621.81953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204621.81962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204621.81975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204621.81991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204621.82010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204621.82017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204621.82024: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204621.82034: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204621.82117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204621.82134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204621.82146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204621.82224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.15073: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204623.15099: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 46400 1727204623.15133: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hiber<<< 46400 1727204623.15154: stdout chunk (state=3): >>>nate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204623.16497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204623.16516: stderr chunk (state=3): >>><<< 46400 1727204623.16531: stdout chunk (state=3): >>><<< 46400 1727204623.16553: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": 
"dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": 
"systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204623.18950: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204623.18959: _low_level_execute_command(): starting 46400 1727204623.19076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204621.6542907-54211-57062586853527/ > /dev/null 2>&1 && sleep 0' 46400 1727204623.21399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.21481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.21487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204623.21586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.21617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.21635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.21650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.21661: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204623.21679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.21762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.21783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.21800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204623.21886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.23801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204623.23805: stdout chunk (state=3): >>><<< 46400 1727204623.23807: stderr chunk (state=3): >>><<< 46400 1727204623.23973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204623.23977: handler run complete 46400 1727204623.24041: variable 'ansible_facts' from source: unknown 46400 1727204623.24237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204623.25087: variable 'ansible_facts' from source: unknown 46400 1727204623.25243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204623.25473: attempt loop complete, returning result 46400 1727204623.25484: _execute() done 46400 1727204623.25491: dumping result to json 46400 1727204623.25576: done dumping result, returning 46400 1727204623.25650: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000002384] 46400 1727204623.25662: sending task result for task 0affcd87-79f5-1303-fda8-000000002384 46400 1727204623.28274: done sending task result for task 0affcd87-79f5-1303-fda8-000000002384 46400 1727204623.28284: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204623.28422: no more pending results, returning what we have 46400 1727204623.28427: results queue empty 46400 1727204623.28428: checking for any_errors_fatal 46400 1727204623.28434: done checking for any_errors_fatal 46400 1727204623.28435: checking for max_fail_percentage 46400 1727204623.28436: done checking for max_fail_percentage 46400 1727204623.28438: checking to see if all hosts have failed and the running result is not ok 46400 1727204623.28439: done checking to see if all hosts have failed 46400 1727204623.28439: getting the remaining hosts for this loop 46400 1727204623.28441: done getting the remaining hosts for this loop 46400 1727204623.28445: getting the next task for host managed-node2 46400 1727204623.28453: done getting next task for host managed-node2 46400 1727204623.28457: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204623.28488: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204623.28504: getting variables 46400 1727204623.28505: in VariableManager get_vars() 46400 1727204623.28546: Calling all_inventory to load vars for managed-node2 46400 1727204623.28548: Calling groups_inventory to load vars for managed-node2 46400 1727204623.28550: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204623.28570: Calling all_plugins_play to load vars for managed-node2 46400 1727204623.28573: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204623.28582: Calling groups_plugins_play to load vars for managed-node2 46400 1727204623.31212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204623.36480: done with get_vars() 46400 1727204623.36519: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:43 -0400 (0:00:01.823) 0:01:53.650 ***** 46400 1727204623.36643: entering _queue_task() for managed-node2/package_facts 46400 1727204623.37072: worker is 1 (out of 1 available) 46400 1727204623.37085: exiting _queue_task() for managed-node2/package_facts 46400 1727204623.37098: done queuing things up, now waiting for results queue to drain 46400 1727204623.37100: waiting for pending results... 
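The task queued next, fedora.linux_system_roles.network : Check which packages are installed (set_facts.yml:26), runs the package_facts module; its output later in the log keys ansible_facts.packages by package name, each entry being a list of dicts with name, version, release, epoch, arch, and source. A hypothetical consumer of that structure, shown only as a sketch and not as the role's actual code, could look like:

  # Example only: gather the installed-package inventory into ansible_facts.packages
  - name: Example only, gather package facts
    ansible.builtin.package_facts:
      manager: auto

  # Example only: use the gathered data; the package name here is an assumption for illustration
  - name: Example only, report NetworkManager version if present
    ansible.builtin.debug:
      msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }}"
    when: "'NetworkManager' in ansible_facts.packages"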
46400 1727204623.37458: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204623.38072: in run() - task 0affcd87-79f5-1303-fda8-000000002385 46400 1727204623.38076: variable 'ansible_search_path' from source: unknown 46400 1727204623.38079: variable 'ansible_search_path' from source: unknown 46400 1727204623.38082: calling self._execute() 46400 1727204623.38084: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204623.38087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204623.38090: variable 'omit' from source: magic vars 46400 1727204623.39368: variable 'ansible_distribution_major_version' from source: facts 46400 1727204623.39378: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204623.39385: variable 'omit' from source: magic vars 46400 1727204623.39583: variable 'omit' from source: magic vars 46400 1727204623.39616: variable 'omit' from source: magic vars 46400 1727204623.39782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204623.39817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204623.39838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204623.39854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204623.39868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204623.40013: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204623.40017: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204623.40019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204623.40234: Set connection var ansible_shell_type to sh 46400 1727204623.40244: Set connection var ansible_shell_executable to /bin/sh 46400 1727204623.40250: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204623.40255: Set connection var ansible_connection to ssh 46400 1727204623.40260: Set connection var ansible_pipelining to False 46400 1727204623.40270: Set connection var ansible_timeout to 10 46400 1727204623.40297: variable 'ansible_shell_executable' from source: unknown 46400 1727204623.40300: variable 'ansible_connection' from source: unknown 46400 1727204623.40303: variable 'ansible_module_compression' from source: unknown 46400 1727204623.40305: variable 'ansible_shell_type' from source: unknown 46400 1727204623.40307: variable 'ansible_shell_executable' from source: unknown 46400 1727204623.40309: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204623.40434: variable 'ansible_pipelining' from source: unknown 46400 1727204623.40438: variable 'ansible_timeout' from source: unknown 46400 1727204623.40441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204623.40887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204623.40899: variable 'omit' from source: magic vars 46400 
1727204623.40903: starting attempt loop 46400 1727204623.40906: running the handler 46400 1727204623.40920: _low_level_execute_command(): starting 46400 1727204623.40927: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204623.42524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.42534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.42800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204623.42805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.42822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.42829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.43013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.43020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.43037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204623.43202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.44768: stdout chunk (state=3): >>>/root <<< 46400 1727204623.44971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204623.44976: stdout chunk (state=3): >>><<< 46400 1727204623.44979: stderr chunk (state=3): >>><<< 46400 1727204623.45103: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204623.45106: _low_level_execute_command(): starting 46400 1727204623.45110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669 `" && echo ansible-tmp-1727204623.4500005-54306-268714594087669="` echo /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669 `" ) && sleep 0' 46400 1727204623.46575: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204623.46687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.46708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.46732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.46784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.46840: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204623.46855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.46877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204623.46888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204623.46898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204623.46909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.46922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.46942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.46955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.46968: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204623.46983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.47124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.47174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.47190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204623.47385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.49214: stdout chunk (state=3): >>>ansible-tmp-1727204623.4500005-54306-268714594087669=/root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669 <<< 46400 1727204623.49436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204623.49439: stdout chunk (state=3): >>><<< 46400 1727204623.49441: stderr chunk (state=3): >>><<< 46400 1727204623.49471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204623.4500005-54306-268714594087669=/root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204623.49675: variable 'ansible_module_compression' from source: unknown 46400 1727204623.49679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204623.49681: variable 'ansible_facts' from source: unknown 46400 1727204623.49862: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/AnsiballZ_package_facts.py 46400 1727204623.50166: Sending initial data 46400 1727204623.50169: Sent initial data (162 bytes) 46400 1727204623.51401: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204623.51418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.51442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.51461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.51506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.51519: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204623.51543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.51562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204623.51577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204623.51588: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204623.51600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.51615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.51660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.51676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.51687: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204623.51699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.51785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.51808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.51825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204623.51942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.53690: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204623.53744: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204623.53771: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpt5ks47mp /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/AnsiballZ_package_facts.py <<< 46400 1727204623.53783: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204623.56915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204623.57038: stderr chunk (state=3): >>><<< 46400 1727204623.57041: stdout chunk (state=3): >>><<< 46400 1727204623.57044: done transferring module to remote 46400 1727204623.57047: _low_level_execute_command(): starting 46400 1727204623.57050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/ /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/AnsiballZ_package_facts.py && sleep 0' 46400 1727204623.59020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204623.59087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.59147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.59172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.59209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.59217: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204623.59227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.59243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204623.59274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204623.59283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204623.59292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.59300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.59333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204623.59339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.59441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.59447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.59475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 46400 1727204623.59538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204623.61383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204623.61387: stderr chunk (state=3): >>><<< 46400 1727204623.61389: stdout chunk (state=3): >>><<< 46400 1727204623.61403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204623.61406: _low_level_execute_command(): starting 46400 1727204623.61412: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/AnsiballZ_package_facts.py && sleep 0' 46400 1727204623.62994: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204623.63010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.63025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.63044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.63102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.63115: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204623.63130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.63147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204623.63159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204623.63177: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204623.63198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204623.63213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204623.63230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204623.63243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204623.63255: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204623.63271: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204623.63352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204623.63376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204623.63392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204623.63480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204624.09885: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", 
"version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204624.10016: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": 
"2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": 
"1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": 
"14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": 
[{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": 
"2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204624.10037: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204624.11586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204624.11658: stderr chunk (state=3): >>><<< 46400 1727204624.11666: stdout chunk (state=3): >>><<< 46400 1727204624.11766: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204624.15940: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204624.15978: _low_level_execute_command(): starting 46400 1727204624.15989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204623.4500005-54306-268714594087669/ > /dev/null 2>&1 && sleep 0' 46400 1727204624.16754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204624.16783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204624.16799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204624.16815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204624.16857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204624.16873: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204624.16892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204624.16907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204624.16917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204624.16925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204624.16935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204624.16946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204624.16963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204624.16980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204624.16990: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204624.17009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204624.17084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204624.17118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204624.17137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204624.17217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204624.19172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204624.19175: stdout chunk (state=3): >>><<< 46400 1727204624.19178: stderr chunk (state=3): >>><<< 46400 1727204624.19376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204624.19380: handler run complete 46400 1727204624.20725: variable 'ansible_facts' from source: unknown 46400 1727204624.22925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.25909: variable 'ansible_facts' from source: unknown 46400 1727204624.26355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.27142: attempt loop complete, returning result 46400 1727204624.27154: _execute() done 46400 1727204624.27157: dumping result to json 46400 1727204624.27468: done dumping result, returning 46400 1727204624.27478: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000002385] 46400 1727204624.27484: sending task result for task 0affcd87-79f5-1303-fda8-000000002385 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204624.49669: no more pending results, returning what we have 46400 1727204624.49673: results queue empty 46400 1727204624.49674: checking for any_errors_fatal 46400 1727204624.49679: done checking for any_errors_fatal 46400 1727204624.49680: checking for max_fail_percentage 46400 1727204624.49682: done checking for max_fail_percentage 46400 1727204624.49682: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.49683: done checking to see if all hosts have failed 46400 1727204624.49684: getting the remaining hosts for this loop 46400 1727204624.49685: done getting the remaining hosts for this loop 46400 1727204624.49689: getting the next task for host managed-node2 46400 1727204624.49698: done getting next task for host managed-node2 46400 1727204624.49702: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204624.49708: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204624.49721: getting variables 46400 1727204624.49723: in VariableManager get_vars() 46400 1727204624.49758: Calling all_inventory to load vars for managed-node2 46400 1727204624.49769: Calling groups_inventory to load vars for managed-node2 46400 1727204624.49772: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.49782: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.49790: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.49793: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.50867: done sending task result for task 0affcd87-79f5-1303-fda8-000000002385 46400 1727204624.50872: WORKER PROCESS EXITING 46400 1727204624.51854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.53796: done with get_vars() 46400 1727204624.54537: done getting variables 46400 1727204624.54606: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:44 -0400 (0:00:01.181) 0:01:54.832 ***** 46400 1727204624.54763: entering _queue_task() for managed-node2/debug 46400 1727204624.55301: worker is 1 (out of 1 available) 46400 1727204624.55315: exiting _queue_task() for managed-node2/debug 46400 1727204624.55328: done queuing things up, now waiting for results queue to drain 46400 1727204624.55330: waiting for pending results... 
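The task just queued is a plain debug call; the "Using network provider: nm" message in its result below comes from a network_provider variable that the log (below) attributes to an earlier set_fact. A minimal sketch of such a task (the real definition lives in the role's tasks/main.yml:7 and is not reproduced here) would be:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"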
46400 1727204624.55686: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204624.55851: in run() - task 0affcd87-79f5-1303-fda8-000000002329 46400 1727204624.55882: variable 'ansible_search_path' from source: unknown 46400 1727204624.55892: variable 'ansible_search_path' from source: unknown 46400 1727204624.55942: calling self._execute() 46400 1727204624.56071: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.56084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.56105: variable 'omit' from source: magic vars 46400 1727204624.56596: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.56615: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.56627: variable 'omit' from source: magic vars 46400 1727204624.56718: variable 'omit' from source: magic vars 46400 1727204624.56837: variable 'network_provider' from source: set_fact 46400 1727204624.56872: variable 'omit' from source: magic vars 46400 1727204624.56930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204624.56979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204624.57009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204624.57038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204624.57054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204624.57096: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204624.57105: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.57113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.57225: Set connection var ansible_shell_type to sh 46400 1727204624.57248: Set connection var ansible_shell_executable to /bin/sh 46400 1727204624.57257: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204624.57271: Set connection var ansible_connection to ssh 46400 1727204624.57283: Set connection var ansible_pipelining to False 46400 1727204624.57298: Set connection var ansible_timeout to 10 46400 1727204624.57328: variable 'ansible_shell_executable' from source: unknown 46400 1727204624.57337: variable 'ansible_connection' from source: unknown 46400 1727204624.57350: variable 'ansible_module_compression' from source: unknown 46400 1727204624.57363: variable 'ansible_shell_type' from source: unknown 46400 1727204624.57373: variable 'ansible_shell_executable' from source: unknown 46400 1727204624.57380: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.57388: variable 'ansible_pipelining' from source: unknown 46400 1727204624.57394: variable 'ansible_timeout' from source: unknown 46400 1727204624.57405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.57572: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204624.57596: variable 'omit' from source: magic vars 46400 1727204624.57606: starting attempt loop 46400 1727204624.57613: running the handler 46400 1727204624.57671: handler run complete 46400 1727204624.57696: attempt loop complete, returning result 46400 1727204624.57703: _execute() done 46400 1727204624.57710: dumping result to json 46400 1727204624.57717: done dumping result, returning 46400 1727204624.57732: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000002329] 46400 1727204624.57743: sending task result for task 0affcd87-79f5-1303-fda8-000000002329 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204624.58143: no more pending results, returning what we have 46400 1727204624.58147: results queue empty 46400 1727204624.58149: checking for any_errors_fatal 46400 1727204624.58166: done checking for any_errors_fatal 46400 1727204624.58167: checking for max_fail_percentage 46400 1727204624.58169: done checking for max_fail_percentage 46400 1727204624.58171: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.58171: done checking to see if all hosts have failed 46400 1727204624.58172: getting the remaining hosts for this loop 46400 1727204624.58175: done getting the remaining hosts for this loop 46400 1727204624.58180: getting the next task for host managed-node2 46400 1727204624.58194: done getting next task for host managed-node2 46400 1727204624.58199: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204624.58206: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204624.58223: getting variables 46400 1727204624.58225: in VariableManager get_vars() 46400 1727204624.58285: Calling all_inventory to load vars for managed-node2 46400 1727204624.58288: Calling groups_inventory to load vars for managed-node2 46400 1727204624.58290: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.58302: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.58304: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.58307: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.67576: done sending task result for task 0affcd87-79f5-1303-fda8-000000002329 46400 1727204624.67580: WORKER PROCESS EXITING 46400 1727204624.68125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.69088: done with get_vars() 46400 1727204624.69114: done getting variables 46400 1727204624.69166: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.144) 0:01:54.976 ***** 46400 1727204624.69206: entering _queue_task() for managed-node2/fail 46400 1727204624.69582: worker is 1 (out of 1 available) 46400 1727204624.69595: exiting _queue_task() for managed-node2/fail 46400 1727204624.69608: done queuing things up, now waiting for results queue to drain 46400 1727204624.69610: waiting for pending results... 
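The task queued here is a guard implemented with the fail module: the log below evaluates its first condition, network_state != {}, to False and skips the task, so any further conditions in its when list are never reached. A hedged sketch of the shape of such a guard follows; the failure message and the provider check are assumptions suggested by the task name, not the role's exact text.

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: "The network_state variable is not supported with the initscripts provider"  # wording assumed
      when:
        - network_state != {}                 # evaluated False in the log, so the task is skipped
        - network_provider == "initscripts"   # assumed follow-up condition implied by the task name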
46400 1727204624.69916: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204624.70088: in run() - task 0affcd87-79f5-1303-fda8-00000000232a 46400 1727204624.70104: variable 'ansible_search_path' from source: unknown 46400 1727204624.70108: variable 'ansible_search_path' from source: unknown 46400 1727204624.70149: calling self._execute() 46400 1727204624.70252: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.70257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.70263: variable 'omit' from source: magic vars 46400 1727204624.70657: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.70671: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.70771: variable 'network_state' from source: role '' defaults 46400 1727204624.70779: Evaluated conditional (network_state != {}): False 46400 1727204624.70782: when evaluation is False, skipping this task 46400 1727204624.70785: _execute() done 46400 1727204624.70788: dumping result to json 46400 1727204624.70791: done dumping result, returning 46400 1727204624.70801: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-00000000232a] 46400 1727204624.70804: sending task result for task 0affcd87-79f5-1303-fda8-00000000232a 46400 1727204624.70899: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232a 46400 1727204624.70903: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204624.70977: no more pending results, returning what we have 46400 1727204624.70982: results queue empty 46400 1727204624.70983: checking for any_errors_fatal 46400 1727204624.70994: done checking for any_errors_fatal 46400 1727204624.70995: checking for max_fail_percentage 46400 1727204624.70997: done checking for max_fail_percentage 46400 1727204624.70998: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.70999: done checking to see if all hosts have failed 46400 1727204624.71000: getting the remaining hosts for this loop 46400 1727204624.71001: done getting the remaining hosts for this loop 46400 1727204624.71005: getting the next task for host managed-node2 46400 1727204624.71015: done getting next task for host managed-node2 46400 1727204624.71020: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204624.71024: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204624.71047: getting variables 46400 1727204624.71049: in VariableManager get_vars() 46400 1727204624.71094: Calling all_inventory to load vars for managed-node2 46400 1727204624.71097: Calling groups_inventory to load vars for managed-node2 46400 1727204624.71099: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.71109: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.71111: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.71114: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.72070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.73023: done with get_vars() 46400 1727204624.73045: done getting variables 46400 1727204624.73094: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.039) 0:01:55.015 ***** 46400 1727204624.73125: entering _queue_task() for managed-node2/fail 46400 1727204624.73542: worker is 1 (out of 1 available) 46400 1727204624.73555: exiting _queue_task() for managed-node2/fail 46400 1727204624.73570: done queuing things up, now waiting for results queue to drain 46400 1727204624.73572: waiting for pending results... 
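The next guard follows the same pattern and illustrates how a when list short-circuits: because network_state != {} is again False, the version comparison implied by the task name never appears in the log at all. A sketch under that assumption:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: "Applying network_state requires a managed host at EL8 or later"  # wording assumed
      when:
        - network_state != {}                             # evaluated False below; evaluation stops here
        - ansible_distribution_major_version | int < 8    # assumed second condition, never evaluated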
46400 1727204624.73762: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204624.73933: in run() - task 0affcd87-79f5-1303-fda8-00000000232b 46400 1727204624.73957: variable 'ansible_search_path' from source: unknown 46400 1727204624.73969: variable 'ansible_search_path' from source: unknown 46400 1727204624.74016: calling self._execute() 46400 1727204624.74131: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.74144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.74159: variable 'omit' from source: magic vars 46400 1727204624.74574: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.74591: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.74722: variable 'network_state' from source: role '' defaults 46400 1727204624.74739: Evaluated conditional (network_state != {}): False 46400 1727204624.74750: when evaluation is False, skipping this task 46400 1727204624.74760: _execute() done 46400 1727204624.74771: dumping result to json 46400 1727204624.74779: done dumping result, returning 46400 1727204624.74791: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-00000000232b] 46400 1727204624.74803: sending task result for task 0affcd87-79f5-1303-fda8-00000000232b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204624.74961: no more pending results, returning what we have 46400 1727204624.74967: results queue empty 46400 1727204624.74968: checking for any_errors_fatal 46400 1727204624.74977: done checking for any_errors_fatal 46400 1727204624.74978: checking for max_fail_percentage 46400 1727204624.74980: done checking for max_fail_percentage 46400 1727204624.74981: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.74982: done checking to see if all hosts have failed 46400 1727204624.74983: getting the remaining hosts for this loop 46400 1727204624.74985: done getting the remaining hosts for this loop 46400 1727204624.74989: getting the next task for host managed-node2 46400 1727204624.75000: done getting next task for host managed-node2 46400 1727204624.75005: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204624.75012: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204624.75041: getting variables 46400 1727204624.75043: in VariableManager get_vars() 46400 1727204624.75095: Calling all_inventory to load vars for managed-node2 46400 1727204624.75099: Calling groups_inventory to load vars for managed-node2 46400 1727204624.75102: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.75115: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.75118: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.75121: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.76084: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232b 46400 1727204624.76088: WORKER PROCESS EXITING 46400 1727204624.76445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.77394: done with get_vars() 46400 1727204624.77414: done getting variables 46400 1727204624.77457: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.043) 0:01:55.059 ***** 46400 1727204624.77487: entering _queue_task() for managed-node2/fail 46400 1727204624.77727: worker is 1 (out of 1 available) 46400 1727204624.77739: exiting _queue_task() for managed-node2/fail 46400 1727204624.77753: done queuing things up, now waiting for results queue to drain 46400 1727204624.77755: waiting for pending results... 
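The teaming guard queued here is gated directly on the distribution version: the log below shows the expression ansible_distribution_major_version | int > 9 evaluated to False on this EL9 host, so the task is skipped. A sketch of that gate (the message and any additional team-interface check are assumptions):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on EL10 or later"  # wording assumed
      when:
        - ansible_distribution_major_version | int > 9   # the exact expression evaluated (False) in the log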
46400 1727204624.77952: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204624.78084: in run() - task 0affcd87-79f5-1303-fda8-00000000232c 46400 1727204624.78091: variable 'ansible_search_path' from source: unknown 46400 1727204624.78094: variable 'ansible_search_path' from source: unknown 46400 1727204624.78123: calling self._execute() 46400 1727204624.78211: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.78216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.78225: variable 'omit' from source: magic vars 46400 1727204624.78890: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.78894: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.78898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204624.81839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204624.81923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204624.81996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204624.82040: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204624.82085: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204624.82184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204624.82223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204624.82255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.82316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204624.82343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204624.82480: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.82510: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204624.82518: when evaluation is False, skipping this task 46400 1727204624.82524: _execute() done 46400 1727204624.82531: dumping result to json 46400 1727204624.82537: done dumping result, returning 46400 1727204624.82553: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-00000000232c] 46400 1727204624.82570: sending task result for task 
0affcd87-79f5-1303-fda8-00000000232c skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204624.82737: no more pending results, returning what we have 46400 1727204624.82741: results queue empty 46400 1727204624.82742: checking for any_errors_fatal 46400 1727204624.82749: done checking for any_errors_fatal 46400 1727204624.82750: checking for max_fail_percentage 46400 1727204624.82752: done checking for max_fail_percentage 46400 1727204624.82753: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.82754: done checking to see if all hosts have failed 46400 1727204624.82755: getting the remaining hosts for this loop 46400 1727204624.82757: done getting the remaining hosts for this loop 46400 1727204624.82766: getting the next task for host managed-node2 46400 1727204624.82776: done getting next task for host managed-node2 46400 1727204624.82781: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204624.82787: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204624.82814: getting variables 46400 1727204624.82816: in VariableManager get_vars() 46400 1727204624.82884: Calling all_inventory to load vars for managed-node2 46400 1727204624.82887: Calling groups_inventory to load vars for managed-node2 46400 1727204624.82890: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.82901: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.82904: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.82907: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.83945: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232c 46400 1727204624.83948: WORKER PROCESS EXITING 46400 1727204624.84519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.85450: done with get_vars() 46400 1727204624.85474: done getting variables 46400 1727204624.85550: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.080) 0:01:55.140 ***** 46400 1727204624.85591: entering _queue_task() for managed-node2/dnf 46400 1727204624.85953: worker is 1 (out of 1 available) 46400 1727204624.85972: exiting _queue_task() for managed-node2/dnf 46400 1727204624.85984: done queuing things up, now waiting for results queue to drain 46400 1727204624.85986: waiting for pending results... 
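The trace above shows the role's EL10 teaming guard in action: the role-wide gate (ansible_distribution_major_version != '6') passes, but ansible_distribution_major_version | int > 9 evaluates to False on this host, so the abort task is skipped. A minimal sketch of what a guard of that shape could look like, reconstructed only from the task name and the false_condition reported in the log (the fail message wording is an assumption, not taken from the role):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    # assumed wording; the actual message in tasks/main.yml is not shown in this log
    msg: Teaming is not supported on EL10 or later, use bonding instead
  when: ansible_distribution_major_version | int > 9
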
46400 1727204624.86289: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204624.86573: in run() - task 0affcd87-79f5-1303-fda8-00000000232d 46400 1727204624.86595: variable 'ansible_search_path' from source: unknown 46400 1727204624.86604: variable 'ansible_search_path' from source: unknown 46400 1727204624.86766: calling self._execute() 46400 1727204624.86979: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.86990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.87005: variable 'omit' from source: magic vars 46400 1727204624.87855: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.87879: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.88123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204624.90726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204624.90815: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204624.90863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204624.90913: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204624.90946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204624.91035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204624.91072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204624.91103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.91149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204624.91172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204624.91308: variable 'ansible_distribution' from source: facts 46400 1727204624.91318: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.91343: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204624.91477: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204624.91619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204624.91654: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204624.91688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.91732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204624.91749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204624.91802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204624.91831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204624.91859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.91910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204624.91929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204624.91977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204624.92008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204624.92037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.92087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204624.92107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204624.92277: variable 'network_connections' from source: include params 46400 1727204624.92292: variable 'interface' from source: play vars 46400 1727204624.92366: variable 'interface' from source: play vars 46400 1727204624.92446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204624.92643: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204624.92691: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204624.92726: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204624.92768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204624.92815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204624.92842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204624.92888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204624.92917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204624.92971: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204624.93230: variable 'network_connections' from source: include params 46400 1727204624.93241: variable 'interface' from source: play vars 46400 1727204624.93312: variable 'interface' from source: play vars 46400 1727204624.93339: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204624.93347: when evaluation is False, skipping this task 46400 1727204624.93353: _execute() done 46400 1727204624.93362: dumping result to json 46400 1727204624.93371: done dumping result, returning 46400 1727204624.93383: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000232d] 46400 1727204624.93392: sending task result for task 0affcd87-79f5-1303-fda8-00000000232d skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204624.93562: no more pending results, returning what we have 46400 1727204624.93568: results queue empty 46400 1727204624.93569: checking for any_errors_fatal 46400 1727204624.93577: done checking for any_errors_fatal 46400 1727204624.93578: checking for max_fail_percentage 46400 1727204624.93580: done checking for max_fail_percentage 46400 1727204624.93581: checking to see if all hosts have failed and the running result is not ok 46400 1727204624.93582: done checking to see if all hosts have failed 46400 1727204624.93583: getting the remaining hosts for this loop 46400 1727204624.93585: done getting the remaining hosts for this loop 46400 1727204624.93589: getting the next task for host managed-node2 46400 1727204624.93598: done getting next task for host managed-node2 46400 1727204624.93603: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204624.93608: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204624.93635: getting variables 46400 1727204624.93637: in VariableManager get_vars() 46400 1727204624.93693: Calling all_inventory to load vars for managed-node2 46400 1727204624.93696: Calling groups_inventory to load vars for managed-node2 46400 1727204624.93699: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204624.93711: Calling all_plugins_play to load vars for managed-node2 46400 1727204624.93714: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204624.93718: Calling groups_plugins_play to load vars for managed-node2 46400 1727204624.94681: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232d 46400 1727204624.94685: WORKER PROCESS EXITING 46400 1727204624.95573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204624.97262: done with get_vars() 46400 1727204624.97325: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204624.97411: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.118) 0:01:55.259 ***** 46400 1727204624.97447: entering _queue_task() for managed-node2/yum 46400 1727204624.97798: worker is 1 (out of 1 available) 46400 1727204624.97811: exiting _queue_task() for managed-node2/yum 46400 1727204624.97824: done queuing things up, now waiting for results queue to drain 46400 1727204624.97826: waiting for pending results... 
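The DNF availability check above is gated twice: first on the platform (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, which is True here), then on whether any wireless or team connections are defined, which is False, so no package query runs. A rough sketch of a check of this shape at tasks/main.yml:36, assuming a check-mode dnf call against the role's network_packages list (the module name and both conditions come from the log; the arguments and register name are guesses):

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # variable referenced later in this log; exact contents assumed
    state: latest
  check_mode: true                  # assumption: only probe for available updates
  register: __network_updates       # hypothetical name
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
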
46400 1727204624.98126: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204624.98296: in run() - task 0affcd87-79f5-1303-fda8-00000000232e 46400 1727204624.98313: variable 'ansible_search_path' from source: unknown 46400 1727204624.98320: variable 'ansible_search_path' from source: unknown 46400 1727204624.98367: calling self._execute() 46400 1727204624.98474: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204624.98489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204624.98502: variable 'omit' from source: magic vars 46400 1727204624.98902: variable 'ansible_distribution_major_version' from source: facts 46400 1727204624.98922: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204624.99107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204625.02004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204625.02098: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204625.02142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204625.02190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204625.02224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204625.02312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.02344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.02380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.02430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.02451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.02570: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.02592: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204625.02602: when evaluation is False, skipping this task 46400 1727204625.02611: _execute() done 46400 1727204625.02619: dumping result to json 46400 1727204625.02626: done dumping result, returning 46400 1727204625.02637: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000232e] 46400 
1727204625.02648: sending task result for task 0affcd87-79f5-1303-fda8-00000000232e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204625.02818: no more pending results, returning what we have 46400 1727204625.02822: results queue empty 46400 1727204625.02824: checking for any_errors_fatal 46400 1727204625.02832: done checking for any_errors_fatal 46400 1727204625.02832: checking for max_fail_percentage 46400 1727204625.02835: done checking for max_fail_percentage 46400 1727204625.02836: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.02837: done checking to see if all hosts have failed 46400 1727204625.02839: getting the remaining hosts for this loop 46400 1727204625.02840: done getting the remaining hosts for this loop 46400 1727204625.02845: getting the next task for host managed-node2 46400 1727204625.02854: done getting next task for host managed-node2 46400 1727204625.02861: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204625.02869: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204625.02898: getting variables 46400 1727204625.02900: in VariableManager get_vars() 46400 1727204625.02953: Calling all_inventory to load vars for managed-node2 46400 1727204625.02956: Calling groups_inventory to load vars for managed-node2 46400 1727204625.02958: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.02974: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.02977: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.02981: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.04083: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232e 46400 1727204625.04086: WORKER PROCESS EXITING 46400 1727204625.04926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.06671: done with get_vars() 46400 1727204625.06700: done getting variables 46400 1727204625.06767: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.093) 0:01:55.352 ***** 46400 1727204625.06807: entering _queue_task() for managed-node2/fail 46400 1727204625.07174: worker is 1 (out of 1 available) 46400 1727204625.07191: exiting _queue_task() for managed-node2/fail 46400 1727204625.07206: done queuing things up, now waiting for results queue to drain 46400 1727204625.07208: waiting for pending results... 
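The YUM variant of the same check only applies to hosts older than EL8 (ansible_distribution_major_version | int < 8), so it is skipped on this node; note also the redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf line above, which is Ansible's standard redirection of the yum action on dnf-based systems. A hedged sketch of that legacy-path task, mirroring the DNF sketch; only the version condition is confirmed by the log, the rest is assumed:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    name: "{{ network_packages }}"  # assumed, mirroring the DNF variant
    state: latest
  check_mode: true                  # assumption
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined  # implied by the task name, not evaluated in this run
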
46400 1727204625.07546: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204625.07728: in run() - task 0affcd87-79f5-1303-fda8-00000000232f 46400 1727204625.07750: variable 'ansible_search_path' from source: unknown 46400 1727204625.07762: variable 'ansible_search_path' from source: unknown 46400 1727204625.07812: calling self._execute() 46400 1727204625.07923: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.07934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.07949: variable 'omit' from source: magic vars 46400 1727204625.08377: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.08396: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.08542: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.08755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204625.11096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204625.11153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204625.11185: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204625.11212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204625.11233: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204625.11298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.11319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.11337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.11366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.11380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.11415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.11431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.11449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.11479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.11490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.11524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.11538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.11554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.11585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.11596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.11723: variable 'network_connections' from source: include params 46400 1727204625.11733: variable 'interface' from source: play vars 46400 1727204625.11789: variable 'interface' from source: play vars 46400 1727204625.11841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204625.11961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204625.11997: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204625.12021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204625.12045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204625.12081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204625.12097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204625.12113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.12133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204625.12178: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204625.12397: variable 'network_connections' 
from source: include params 46400 1727204625.12401: variable 'interface' from source: play vars 46400 1727204625.12463: variable 'interface' from source: play vars 46400 1727204625.12596: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204625.12598: when evaluation is False, skipping this task 46400 1727204625.12600: _execute() done 46400 1727204625.12602: dumping result to json 46400 1727204625.12604: done dumping result, returning 46400 1727204625.12606: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000232f] 46400 1727204625.12607: sending task result for task 0affcd87-79f5-1303-fda8-00000000232f 46400 1727204625.12680: done sending task result for task 0affcd87-79f5-1303-fda8-00000000232f 46400 1727204625.12683: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204625.12729: no more pending results, returning what we have 46400 1727204625.12733: results queue empty 46400 1727204625.12734: checking for any_errors_fatal 46400 1727204625.12742: done checking for any_errors_fatal 46400 1727204625.12743: checking for max_fail_percentage 46400 1727204625.12745: done checking for max_fail_percentage 46400 1727204625.12746: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.12747: done checking to see if all hosts have failed 46400 1727204625.12748: getting the remaining hosts for this loop 46400 1727204625.12749: done getting the remaining hosts for this loop 46400 1727204625.12753: getting the next task for host managed-node2 46400 1727204625.12760: done getting next task for host managed-node2 46400 1727204625.12766: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204625.12770: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204625.12791: getting variables 46400 1727204625.12793: in VariableManager get_vars() 46400 1727204625.12833: Calling all_inventory to load vars for managed-node2 46400 1727204625.12836: Calling groups_inventory to load vars for managed-node2 46400 1727204625.12839: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.12849: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.12852: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.12854: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.14209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.15295: done with get_vars() 46400 1727204625.15312: done getting variables 46400 1727204625.15357: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.085) 0:01:55.438 ***** 46400 1727204625.15390: entering _queue_task() for managed-node2/package 46400 1727204625.15634: worker is 1 (out of 1 available) 46400 1727204625.15653: exiting _queue_task() for managed-node2/package 46400 1727204625.15685: done queuing things up, now waiting for results queue to drain 46400 1727204625.15687: waiting for pending results... 46400 1727204625.15970: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204625.16151: in run() - task 0affcd87-79f5-1303-fda8-000000002330 46400 1727204625.16172: variable 'ansible_search_path' from source: unknown 46400 1727204625.16180: variable 'ansible_search_path' from source: unknown 46400 1727204625.16235: calling self._execute() 46400 1727204625.16351: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.16370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.16384: variable 'omit' from source: magic vars 46400 1727204625.16810: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.16828: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.17042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204625.17335: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204625.17394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204625.17435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204625.17553: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204625.17776: variable 'network_packages' from source: role '' defaults 46400 1727204625.17928: variable '__network_provider_setup' from source: role '' defaults 46400 1727204625.17937: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204625.17990: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204625.18003: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204625.18069: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204625.18188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204625.19798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204625.19876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204625.19923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204625.19963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204625.20005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204625.20119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.20150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.20191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.20247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.20270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.20340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.20373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.20405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.20462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.20484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.20772: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204625.20903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.20934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.20981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.21030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.21053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.21163: variable 'ansible_python' from source: facts 46400 1727204625.21197: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204625.21303: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204625.21402: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204625.21548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.21581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.21626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.21668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.21681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.21769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.21781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.21784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.21821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.21845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.22000: variable 'network_connections' from source: include params 46400 1727204625.22005: variable 'interface' from source: play vars 46400 1727204625.22113: variable 'interface' from source: play vars 46400 1727204625.22209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204625.22245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204625.22319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.22359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204625.22423: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.22753: variable 'network_connections' from source: include params 46400 1727204625.22765: variable 'interface' from source: play vars 46400 1727204625.22883: variable 'interface' from source: play vars 46400 1727204625.22922: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204625.23037: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.23482: variable 'network_connections' from source: include params 46400 1727204625.23486: variable 'interface' from source: play vars 46400 1727204625.23534: variable 'interface' from source: play vars 46400 1727204625.23552: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204625.23617: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204625.23826: variable 'network_connections' from source: include params 46400 1727204625.23829: variable 'interface' from source: play vars 46400 1727204625.23879: variable 'interface' from source: play vars 46400 1727204625.23921: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204625.23966: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204625.23972: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204625.24014: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204625.24157: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204625.24475: variable 'network_connections' from source: include params 46400 1727204625.24478: variable 'interface' from source: play vars 46400 1727204625.24520: variable 'interface' from source: play vars 46400 1727204625.24527: variable 'ansible_distribution' from source: facts 46400 1727204625.24529: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.24536: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.24547: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204625.24662: variable 'ansible_distribution' from source: 
facts 46400 1727204625.24671: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.24674: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.24686: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204625.24795: variable 'ansible_distribution' from source: facts 46400 1727204625.24798: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.24803: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.24830: variable 'network_provider' from source: set_fact 46400 1727204625.24844: variable 'ansible_facts' from source: unknown 46400 1727204625.25499: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204625.25503: when evaluation is False, skipping this task 46400 1727204625.25505: _execute() done 46400 1727204625.25508: dumping result to json 46400 1727204625.25510: done dumping result, returning 46400 1727204625.25520: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-000000002330] 46400 1727204625.25527: sending task result for task 0affcd87-79f5-1303-fda8-000000002330 46400 1727204625.25628: done sending task result for task 0affcd87-79f5-1303-fda8-000000002330 46400 1727204625.25631: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204625.25680: no more pending results, returning what we have 46400 1727204625.25684: results queue empty 46400 1727204625.25685: checking for any_errors_fatal 46400 1727204625.25694: done checking for any_errors_fatal 46400 1727204625.25695: checking for max_fail_percentage 46400 1727204625.25697: done checking for max_fail_percentage 46400 1727204625.25697: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.25698: done checking to see if all hosts have failed 46400 1727204625.25699: getting the remaining hosts for this loop 46400 1727204625.25701: done getting the remaining hosts for this loop 46400 1727204625.25705: getting the next task for host managed-node2 46400 1727204625.25713: done getting next task for host managed-node2 46400 1727204625.25718: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204625.25722: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204625.25748: getting variables 46400 1727204625.25749: in VariableManager get_vars() 46400 1727204625.25801: Calling all_inventory to load vars for managed-node2 46400 1727204625.25803: Calling groups_inventory to load vars for managed-node2 46400 1727204625.25806: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.25816: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.25818: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.25820: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.27048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.28017: done with get_vars() 46400 1727204625.28040: done getting variables 46400 1727204625.28092: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.127) 0:01:55.565 ***** 46400 1727204625.28120: entering _queue_task() for managed-node2/package 46400 1727204625.28383: worker is 1 (out of 1 available) 46400 1727204625.28395: exiting _queue_task() for managed-node2/package 46400 1727204625.28408: done queuing things up, now waiting for results queue to drain 46400 1727204625.28410: waiting for pending results... 
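The Install packages task above is the point where the role would actually install its package set; it is skipped because every entry of network_packages is already in the gathered package facts, so not network_packages is subset(ansible_facts.packages.keys()) is False. A minimal sketch under that reading, using the generic package action module that the log loads for this task (the state and any retry options are assumptions):

- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present  # assumed
  when: not network_packages is subset(ansible_facts.packages.keys())
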
46400 1727204625.28616: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204625.28719: in run() - task 0affcd87-79f5-1303-fda8-000000002331 46400 1727204625.28736: variable 'ansible_search_path' from source: unknown 46400 1727204625.28741: variable 'ansible_search_path' from source: unknown 46400 1727204625.28771: calling self._execute() 46400 1727204625.28852: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.28856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.28868: variable 'omit' from source: magic vars 46400 1727204625.29170: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.29177: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.29264: variable 'network_state' from source: role '' defaults 46400 1727204625.29273: Evaluated conditional (network_state != {}): False 46400 1727204625.29277: when evaluation is False, skipping this task 46400 1727204625.29281: _execute() done 46400 1727204625.29283: dumping result to json 46400 1727204625.29285: done dumping result, returning 46400 1727204625.29294: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000002331] 46400 1727204625.29301: sending task result for task 0affcd87-79f5-1303-fda8-000000002331 46400 1727204625.29395: done sending task result for task 0affcd87-79f5-1303-fda8-000000002331 46400 1727204625.29398: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204625.29449: no more pending results, returning what we have 46400 1727204625.29453: results queue empty 46400 1727204625.29454: checking for any_errors_fatal 46400 1727204625.29467: done checking for any_errors_fatal 46400 1727204625.29468: checking for max_fail_percentage 46400 1727204625.29470: done checking for max_fail_percentage 46400 1727204625.29471: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.29472: done checking to see if all hosts have failed 46400 1727204625.29473: getting the remaining hosts for this loop 46400 1727204625.29475: done getting the remaining hosts for this loop 46400 1727204625.29478: getting the next task for host managed-node2 46400 1727204625.29487: done getting next task for host managed-node2 46400 1727204625.29492: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204625.29496: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204625.29528: getting variables 46400 1727204625.29530: in VariableManager get_vars() 46400 1727204625.29576: Calling all_inventory to load vars for managed-node2 46400 1727204625.29579: Calling groups_inventory to load vars for managed-node2 46400 1727204625.29581: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.29590: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.29592: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.29595: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.30602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.31529: done with get_vars() 46400 1727204625.31546: done getting variables 46400 1727204625.31596: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.035) 0:01:55.600 ***** 46400 1727204625.31623: entering _queue_task() for managed-node2/package 46400 1727204625.31872: worker is 1 (out of 1 available) 46400 1727204625.31886: exiting _queue_task() for managed-node2/package 46400 1727204625.31899: done queuing things up, now waiting for results queue to drain 46400 1727204625.31900: waiting for pending results... 
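Both install tasks are skipped for the same reason: `network_state` is resolved from the role defaults and is an empty mapping, so `network_state != {}` is False. The sketch below shows the two sides of that comparison; the defaults file location and the nmstate-style structure are illustrative assumptions, not taken from the role source.

# roles/network/tasks/../defaults/main.yml (assumed location of the default)
network_state: {}          # empty dict, so `network_state != {}` is False and the nmstate install tasks are skipped

# what a caller could set to make the conditional True (illustrative nmstate-style content)
network_state:
  interfaces:
    - name: eth1
      type: ethernet
      state: up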
46400 1727204625.32098: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204625.32198: in run() - task 0affcd87-79f5-1303-fda8-000000002332 46400 1727204625.32209: variable 'ansible_search_path' from source: unknown 46400 1727204625.32212: variable 'ansible_search_path' from source: unknown 46400 1727204625.32246: calling self._execute() 46400 1727204625.32321: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.32325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.32335: variable 'omit' from source: magic vars 46400 1727204625.32623: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.32632: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.32723: variable 'network_state' from source: role '' defaults 46400 1727204625.32732: Evaluated conditional (network_state != {}): False 46400 1727204625.32736: when evaluation is False, skipping this task 46400 1727204625.32739: _execute() done 46400 1727204625.32742: dumping result to json 46400 1727204625.32745: done dumping result, returning 46400 1727204625.32750: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-000000002332] 46400 1727204625.32755: sending task result for task 0affcd87-79f5-1303-fda8-000000002332 46400 1727204625.32855: done sending task result for task 0affcd87-79f5-1303-fda8-000000002332 46400 1727204625.32857: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204625.32933: no more pending results, returning what we have 46400 1727204625.32936: results queue empty 46400 1727204625.32937: checking for any_errors_fatal 46400 1727204625.32943: done checking for any_errors_fatal 46400 1727204625.32944: checking for max_fail_percentage 46400 1727204625.32945: done checking for max_fail_percentage 46400 1727204625.32946: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.32947: done checking to see if all hosts have failed 46400 1727204625.32948: getting the remaining hosts for this loop 46400 1727204625.32949: done getting the remaining hosts for this loop 46400 1727204625.32953: getting the next task for host managed-node2 46400 1727204625.32962: done getting next task for host managed-node2 46400 1727204625.32968: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204625.32972: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204625.33002: getting variables 46400 1727204625.33003: in VariableManager get_vars() 46400 1727204625.33040: Calling all_inventory to load vars for managed-node2 46400 1727204625.33043: Calling groups_inventory to load vars for managed-node2 46400 1727204625.33045: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.33054: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.33056: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.33058: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.33887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.34839: done with get_vars() 46400 1727204625.34858: done getting variables 46400 1727204625.34907: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.033) 0:01:55.633 ***** 46400 1727204625.34935: entering _queue_task() for managed-node2/service 46400 1727204625.35184: worker is 1 (out of 1 available) 46400 1727204625.35198: exiting _queue_task() for managed-node2/service 46400 1727204625.35212: done queuing things up, now waiting for results queue to drain 46400 1727204625.35214: waiting for pending results... 
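The next task queued is a `service` task (tasks/main.yml:109) that restarts NetworkManager only when wireless or team connections are defined; the log below shows the conditional `__network_wireless_connections_defined or __network_team_connections_defined` evaluating to False for the connection built from the `interface` play variable, so this task is skipped as well. A sketch of a task with that shape follows, where the service name is an assumption inferred from the task title rather than taken from the role source.

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumption: inferred from the task name, not shown in the log
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined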
46400 1727204625.35419: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204625.35528: in run() - task 0affcd87-79f5-1303-fda8-000000002333 46400 1727204625.35540: variable 'ansible_search_path' from source: unknown 46400 1727204625.35545: variable 'ansible_search_path' from source: unknown 46400 1727204625.35578: calling self._execute() 46400 1727204625.35659: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.35668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.35676: variable 'omit' from source: magic vars 46400 1727204625.35973: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.35984: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.36073: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.36215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204625.38275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204625.38340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204625.38394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204625.38429: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204625.38455: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204625.38538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.38573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.38602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.38644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.38658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.38709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.38733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.38759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204625.38802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.38817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.38857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.38886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.38912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.38950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.38969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.39149: variable 'network_connections' from source: include params 46400 1727204625.39162: variable 'interface' from source: play vars 46400 1727204625.39237: variable 'interface' from source: play vars 46400 1727204625.39310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204625.39484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204625.39523: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204625.39554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204625.39599: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204625.39643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204625.39671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204625.39698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.39723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204625.39776: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204625.40030: variable 'network_connections' from source: include params 46400 1727204625.40035: variable 'interface' 
from source: play vars 46400 1727204625.40106: variable 'interface' from source: play vars 46400 1727204625.40132: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204625.40136: when evaluation is False, skipping this task 46400 1727204625.40139: _execute() done 46400 1727204625.40142: dumping result to json 46400 1727204625.40144: done dumping result, returning 46400 1727204625.40152: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000002333] 46400 1727204625.40159: sending task result for task 0affcd87-79f5-1303-fda8-000000002333 46400 1727204625.40275: done sending task result for task 0affcd87-79f5-1303-fda8-000000002333 46400 1727204625.40285: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204625.40329: no more pending results, returning what we have 46400 1727204625.40333: results queue empty 46400 1727204625.40334: checking for any_errors_fatal 46400 1727204625.40341: done checking for any_errors_fatal 46400 1727204625.40342: checking for max_fail_percentage 46400 1727204625.40344: done checking for max_fail_percentage 46400 1727204625.40345: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.40346: done checking to see if all hosts have failed 46400 1727204625.40347: getting the remaining hosts for this loop 46400 1727204625.40348: done getting the remaining hosts for this loop 46400 1727204625.40352: getting the next task for host managed-node2 46400 1727204625.40361: done getting next task for host managed-node2 46400 1727204625.40367: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204625.40372: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204625.40398: getting variables 46400 1727204625.40399: in VariableManager get_vars() 46400 1727204625.40446: Calling all_inventory to load vars for managed-node2 46400 1727204625.40449: Calling groups_inventory to load vars for managed-node2 46400 1727204625.40451: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.40461: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.40465: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.40468: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.42112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204625.43053: done with get_vars() 46400 1727204625.43076: done getting variables 46400 1727204625.43121: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:45 -0400 (0:00:00.082) 0:01:55.716 ***** 46400 1727204625.43150: entering _queue_task() for managed-node2/service 46400 1727204625.43410: worker is 1 (out of 1 available) 46400 1727204625.43425: exiting _queue_task() for managed-node2/service 46400 1727204625.43438: done queuing things up, now waiting for results queue to drain 46400 1727204625.43439: waiting for pending results... 46400 1727204625.43633: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204625.43747: in run() - task 0affcd87-79f5-1303-fda8-000000002334 46400 1727204625.43758: variable 'ansible_search_path' from source: unknown 46400 1727204625.43766: variable 'ansible_search_path' from source: unknown 46400 1727204625.43799: calling self._execute() 46400 1727204625.43875: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.43879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.43889: variable 'omit' from source: magic vars 46400 1727204625.44185: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.44194: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204625.44312: variable 'network_provider' from source: set_fact 46400 1727204625.44318: variable 'network_state' from source: role '' defaults 46400 1727204625.44325: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204625.44335: variable 'omit' from source: magic vars 46400 1727204625.44379: variable 'omit' from source: magic vars 46400 1727204625.44398: variable 'network_service_name' from source: role '' defaults 46400 1727204625.44450: variable 'network_service_name' from source: role '' defaults 46400 1727204625.44522: variable '__network_provider_setup' from source: role '' defaults 46400 1727204625.44528: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204625.44582: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204625.44589: variable '__network_packages_default_nm' from source: role '' defaults 
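Unlike the previous tasks, "Enable and start NetworkManager" (tasks/main.yml:122) passes its conditional, `network_provider == "nm" or network_state != {}`, and goes on to resolve `network_service_name` and the other provider-setup variables before running the service action; the systemd module invocation later in this section confirms the effective arguments (`name: NetworkManager`, `state: started`, `enabled: true`). The sketch below is consistent with that, with the use of the `network_service_name` variable being an assumption based on the variables resolved in the log above.

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager on this host, per the module_args in the log
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}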
46400 1727204625.44634: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204625.44793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204625.46385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204625.46441: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204625.46471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204625.46498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204625.46521: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204625.46580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.46600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.46620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.46650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.46665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.46696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.46711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.46731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.46762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.46773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.46932: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204625.47013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.47029: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.47046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.47079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.47090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.47152: variable 'ansible_python' from source: facts 46400 1727204625.47172: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204625.47227: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204625.47287: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204625.47368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.47392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.47408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.47433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.47444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.47479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204625.47501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204625.47519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.47544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204625.47556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204625.47654: variable 'network_connections' from 
source: include params 46400 1727204625.47662: variable 'interface' from source: play vars 46400 1727204625.47713: variable 'interface' from source: play vars 46400 1727204625.47789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204625.47927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204625.47966: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204625.47995: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204625.48023: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204625.48072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204625.48094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204625.48115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204625.48138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204625.48180: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.48365: variable 'network_connections' from source: include params 46400 1727204625.48375: variable 'interface' from source: play vars 46400 1727204625.48420: variable 'interface' from source: play vars 46400 1727204625.48445: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204625.48504: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204625.48691: variable 'network_connections' from source: include params 46400 1727204625.48700: variable 'interface' from source: play vars 46400 1727204625.48747: variable 'interface' from source: play vars 46400 1727204625.48767: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204625.48822: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204625.49015: variable 'network_connections' from source: include params 46400 1727204625.49018: variable 'interface' from source: play vars 46400 1727204625.49069: variable 'interface' from source: play vars 46400 1727204625.49107: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204625.49152: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204625.49158: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204625.49202: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204625.49340: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204625.49668: variable 'network_connections' from source: include params 46400 1727204625.49671: variable 'interface' from source: play vars 46400 1727204625.49716: variable 'interface' from 
source: play vars 46400 1727204625.49722: variable 'ansible_distribution' from source: facts 46400 1727204625.49725: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.49730: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.49741: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204625.49858: variable 'ansible_distribution' from source: facts 46400 1727204625.49865: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.49868: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.49880: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204625.49992: variable 'ansible_distribution' from source: facts 46400 1727204625.49996: variable '__network_rh_distros' from source: role '' defaults 46400 1727204625.50005: variable 'ansible_distribution_major_version' from source: facts 46400 1727204625.50029: variable 'network_provider' from source: set_fact 46400 1727204625.50047: variable 'omit' from source: magic vars 46400 1727204625.50075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204625.50100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204625.50119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204625.50132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204625.50140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204625.50167: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204625.50170: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.50172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.50239: Set connection var ansible_shell_type to sh 46400 1727204625.50247: Set connection var ansible_shell_executable to /bin/sh 46400 1727204625.50252: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204625.50257: Set connection var ansible_connection to ssh 46400 1727204625.50265: Set connection var ansible_pipelining to False 46400 1727204625.50268: Set connection var ansible_timeout to 10 46400 1727204625.50289: variable 'ansible_shell_executable' from source: unknown 46400 1727204625.50292: variable 'ansible_connection' from source: unknown 46400 1727204625.50294: variable 'ansible_module_compression' from source: unknown 46400 1727204625.50296: variable 'ansible_shell_type' from source: unknown 46400 1727204625.50298: variable 'ansible_shell_executable' from source: unknown 46400 1727204625.50301: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204625.50305: variable 'ansible_pipelining' from source: unknown 46400 1727204625.50307: variable 'ansible_timeout' from source: unknown 46400 1727204625.50313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204625.50388: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204625.50397: variable 'omit' from source: magic vars 46400 1727204625.50403: starting attempt loop 46400 1727204625.50406: running the handler 46400 1727204625.50467: variable 'ansible_facts' from source: unknown 46400 1727204625.51094: _low_level_execute_command(): starting 46400 1727204625.51100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204625.51624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.51667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.51670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.51674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.51676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.51717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204625.51732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.51789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.53451: stdout chunk (state=3): >>>/root <<< 46400 1727204625.53545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204625.53610: stderr chunk (state=3): >>><<< 46400 1727204625.53613: stdout chunk (state=3): >>><<< 46400 1727204625.53634: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 46400 1727204625.53646: _low_level_execute_command(): starting 46400 1727204625.53652: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691 `" && echo ansible-tmp-1727204625.5363414-54397-219368914971691="` echo /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691 `" ) && sleep 0' 46400 1727204625.54147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.54150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.54192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.54195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.54197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204625.54203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.54255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204625.54258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204625.54268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.54307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.56184: stdout chunk (state=3): >>>ansible-tmp-1727204625.5363414-54397-219368914971691=/root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691 <<< 46400 1727204625.56300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204625.56362: stderr chunk (state=3): >>><<< 46400 1727204625.56368: stdout chunk (state=3): >>><<< 46400 1727204625.56387: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204625.5363414-54397-219368914971691=/root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204625.56415: variable 'ansible_module_compression' from source: unknown 46400 1727204625.56468: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204625.56512: variable 'ansible_facts' from source: unknown 46400 1727204625.56649: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/AnsiballZ_systemd.py 46400 1727204625.56778: Sending initial data 46400 1727204625.56781: Sent initial data (156 bytes) 46400 1727204625.57512: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.57522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.57569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.57590: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204625.57613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.57641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204625.57661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204625.57677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204625.57691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.57705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.57716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.57768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204625.57782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204625.57793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.57847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.59575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204625.59615: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; 
using 64 <<< 46400 1727204625.59657: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp5snny3y6 /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/AnsiballZ_systemd.py <<< 46400 1727204625.59697: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204625.61831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204625.61931: stderr chunk (state=3): >>><<< 46400 1727204625.61934: stdout chunk (state=3): >>><<< 46400 1727204625.61952: done transferring module to remote 46400 1727204625.61965: _low_level_execute_command(): starting 46400 1727204625.61968: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/ /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/AnsiballZ_systemd.py && sleep 0' 46400 1727204625.62443: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.62450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.62501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.62505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.62507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.62509: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.62567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204625.62573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.62621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.64470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204625.64474: stdout chunk (state=3): >>><<< 46400 1727204625.64478: stderr chunk (state=3): >>><<< 46400 1727204625.64501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204625.64510: _low_level_execute_command(): starting 46400 1727204625.64523: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/AnsiballZ_systemd.py && sleep 0' 46400 1727204625.65233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204625.65253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.65279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.65304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.65346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.65370: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204625.65385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.65407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204625.65418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204625.65429: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204625.65440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.65454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.65482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.65495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.65515: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204625.65529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.65620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204625.65646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204625.65669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.65753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.91057: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204625.91093: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6942720", "MemoryAvailable": "infinity", "CPUUsageNSec": "2290392000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204625.91109: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": 
"yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204625.92713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204625.92717: stdout chunk (state=3): >>><<< 46400 1727204625.92720: stderr chunk (state=3): >>><<< 46400 1727204625.92805: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", 
"ControlGroupId": "3602", "MemoryCurrent": "6942720", "MemoryAvailable": "infinity", "CPUUsageNSec": "2290392000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204625.93021: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204625.93046: _low_level_execute_command(): starting 46400 1727204625.93055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204625.5363414-54397-219368914971691/ > /dev/null 2>&1 && sleep 0' 46400 1727204625.94967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204625.95044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.95059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.95081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.95124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.95161: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204625.95260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.95280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204625.95292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204625.95301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204625.95312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204625.95323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204625.95337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204625.95351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204625.95367: stderr chunk (state=3): >>>debug2: match found <<< 46400 
1727204625.95383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204625.95461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204625.95604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204625.95620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204625.95699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204625.97617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204625.97620: stdout chunk (state=3): >>><<< 46400 1727204625.97623: stderr chunk (state=3): >>><<< 46400 1727204625.97670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204625.97674: handler run complete 46400 1727204625.97875: attempt loop complete, returning result 46400 1727204625.97879: _execute() done 46400 1727204625.97881: dumping result to json 46400 1727204625.97883: done dumping result, returning 46400 1727204625.97885: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-000000002334] 46400 1727204625.97888: sending task result for task 0affcd87-79f5-1303-fda8-000000002334 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204625.98174: no more pending results, returning what we have 46400 1727204625.98180: results queue empty 46400 1727204625.98181: checking for any_errors_fatal 46400 1727204625.98189: done checking for any_errors_fatal 46400 1727204625.98190: checking for max_fail_percentage 46400 1727204625.98192: done checking for max_fail_percentage 46400 1727204625.98193: checking to see if all hosts have failed and the running result is not ok 46400 1727204625.98194: done checking to see if all hosts have failed 46400 1727204625.98194: getting the remaining hosts for this loop 46400 1727204625.98196: done getting the remaining hosts for this loop 46400 1727204625.98200: getting the next task for host managed-node2 46400 1727204625.98210: done getting next task for host managed-node2 46400 1727204625.98216: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204625.98221: 
^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204625.98236: getting variables 46400 1727204625.98238: in VariableManager get_vars() 46400 1727204625.98284: Calling all_inventory to load vars for managed-node2 46400 1727204625.98287: Calling groups_inventory to load vars for managed-node2 46400 1727204625.98290: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204625.98301: Calling all_plugins_play to load vars for managed-node2 46400 1727204625.98304: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204625.98307: Calling groups_plugins_play to load vars for managed-node2 46400 1727204625.99029: done sending task result for task 0affcd87-79f5-1303-fda8-000000002334 46400 1727204625.99037: WORKER PROCESS EXITING 46400 1727204626.01706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204626.05477: done with get_vars() 46400 1727204626.05517: done getting variables 46400 1727204626.05702: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.625) 0:01:56.341 ***** 46400 1727204626.05743: entering _queue_task() for managed-node2/service 46400 1727204626.06515: worker is 1 (out of 1 available) 46400 1727204626.06646: exiting _queue_task() for managed-node2/service 46400 1727204626.06660: done queuing things up, now waiting for results queue to drain 46400 1727204626.06661: waiting for pending results... 
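The censored "ok" result above belongs to the role's "Enable and start NetworkManager" task. The log records the dispatched module call as ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, and the result is hidden because no_log is set for it. A minimal standalone sketch that reproduces the same module call, assuming a direct systemd task rather than the role's actual task source (which is not shown in this log):

- hosts: managed-node2
  become: true
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:   # the log records the dispatched module as ansible.legacy.systemd
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # matches the "output has been hidden" result above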
46400 1727204626.07679: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204626.08085: in run() - task 0affcd87-79f5-1303-fda8-000000002335 46400 1727204626.08107: variable 'ansible_search_path' from source: unknown 46400 1727204626.08115: variable 'ansible_search_path' from source: unknown 46400 1727204626.08283: calling self._execute() 46400 1727204626.08515: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.08527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.08540: variable 'omit' from source: magic vars 46400 1727204626.09484: variable 'ansible_distribution_major_version' from source: facts 46400 1727204626.09501: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204626.09650: variable 'network_provider' from source: set_fact 46400 1727204626.09801: Evaluated conditional (network_provider == "nm"): True 46400 1727204626.09980: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204626.10188: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204626.10608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204626.16019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204626.16217: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204626.16281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204626.16412: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204626.16447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204626.16676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204626.16836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204626.16893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204626.16951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204626.17000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204626.17086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204626.17280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204626.17309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204626.17469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204626.17492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204626.17536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204626.17575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204626.17606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204626.17727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204626.17744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204626.18073: variable 'network_connections' from source: include params 46400 1727204626.18139: variable 'interface' from source: play vars 46400 1727204626.18310: variable 'interface' from source: play vars 46400 1727204626.18417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204626.19434: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204626.19716: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204626.19749: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204626.19853: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204626.20031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204626.20223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204626.20289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204626.20402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204626.20467: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204626.21226: variable 'network_connections' from source: include params 46400 1727204626.21383: variable 'interface' from source: play vars 46400 1727204626.21454: variable 'interface' from source: play vars 46400 1727204626.21607: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204626.21615: when evaluation is False, skipping this task 46400 1727204626.21621: _execute() done 46400 1727204626.21628: dumping result to json 46400 1727204626.21635: done dumping result, returning 46400 1727204626.21648: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-000000002335] 46400 1727204626.21672: sending task result for task 0affcd87-79f5-1303-fda8-000000002335 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204626.21841: no more pending results, returning what we have 46400 1727204626.21846: results queue empty 46400 1727204626.21847: checking for any_errors_fatal 46400 1727204626.21878: done checking for any_errors_fatal 46400 1727204626.21879: checking for max_fail_percentage 46400 1727204626.21881: done checking for max_fail_percentage 46400 1727204626.21882: checking to see if all hosts have failed and the running result is not ok 46400 1727204626.21883: done checking to see if all hosts have failed 46400 1727204626.21884: getting the remaining hosts for this loop 46400 1727204626.21886: done getting the remaining hosts for this loop 46400 1727204626.21890: getting the next task for host managed-node2 46400 1727204626.21903: done getting next task for host managed-node2 46400 1727204626.21907: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204626.21913: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204626.21937: getting variables 46400 1727204626.21939: in VariableManager get_vars() 46400 1727204626.21993: Calling all_inventory to load vars for managed-node2 46400 1727204626.21996: Calling groups_inventory to load vars for managed-node2 46400 1727204626.21999: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204626.22010: Calling all_plugins_play to load vars for managed-node2 46400 1727204626.22013: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204626.22016: Calling groups_plugins_play to load vars for managed-node2 46400 1727204626.23371: done sending task result for task 0affcd87-79f5-1303-fda8-000000002335 46400 1727204626.23375: WORKER PROCESS EXITING 46400 1727204626.25224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204626.30343: done with get_vars() 46400 1727204626.30391: done getting variables 46400 1727204626.30455: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.247) 0:01:56.589 ***** 46400 1727204626.30500: entering _queue_task() for managed-node2/service 46400 1727204626.31126: worker is 1 (out of 1 available) 46400 1727204626.31139: exiting _queue_task() for managed-node2/service 46400 1727204626.31270: done queuing things up, now waiting for results queue to drain 46400 1727204626.31272: waiting for pending results... 
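The skip of "Enable and start wpa_supplicant" above comes from __network_wpa_supplicant_required evaluating to False after network_provider == "nm" evaluated to True; the log loads that variable from the role defaults alongside __network_ieee802_1x_connections_defined and __network_wireless_connections_defined. A hedged sketch of how such gating is typically expressed, using the variable names from the log; the default expression and the service arguments are illustrative assumptions, not the role's verbatim source:

# defaults (sketch): wpa_supplicant is only needed for 802.1x or wireless profiles
__network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"

# task (sketch): both conditions below appear as evaluated conditionals in the log
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required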
46400 1727204626.32117: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204626.32293: in run() - task 0affcd87-79f5-1303-fda8-000000002336 46400 1727204626.32312: variable 'ansible_search_path' from source: unknown 46400 1727204626.32320: variable 'ansible_search_path' from source: unknown 46400 1727204626.32370: calling self._execute() 46400 1727204626.32481: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.32492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.32505: variable 'omit' from source: magic vars 46400 1727204626.32917: variable 'ansible_distribution_major_version' from source: facts 46400 1727204626.32955: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204626.33091: variable 'network_provider' from source: set_fact 46400 1727204626.33102: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204626.33111: when evaluation is False, skipping this task 46400 1727204626.33123: _execute() done 46400 1727204626.33132: dumping result to json 46400 1727204626.33138: done dumping result, returning 46400 1727204626.33148: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-000000002336] 46400 1727204626.33158: sending task result for task 0affcd87-79f5-1303-fda8-000000002336 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204626.33318: no more pending results, returning what we have 46400 1727204626.33323: results queue empty 46400 1727204626.33324: checking for any_errors_fatal 46400 1727204626.33335: done checking for any_errors_fatal 46400 1727204626.33335: checking for max_fail_percentage 46400 1727204626.33337: done checking for max_fail_percentage 46400 1727204626.33338: checking to see if all hosts have failed and the running result is not ok 46400 1727204626.33339: done checking to see if all hosts have failed 46400 1727204626.33340: getting the remaining hosts for this loop 46400 1727204626.33341: done getting the remaining hosts for this loop 46400 1727204626.33345: getting the next task for host managed-node2 46400 1727204626.33356: done getting next task for host managed-node2 46400 1727204626.33365: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204626.33371: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204626.33400: getting variables 46400 1727204626.33402: in VariableManager get_vars() 46400 1727204626.33452: Calling all_inventory to load vars for managed-node2 46400 1727204626.33455: Calling groups_inventory to load vars for managed-node2 46400 1727204626.33458: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204626.33475: Calling all_plugins_play to load vars for managed-node2 46400 1727204626.33478: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204626.33481: Calling groups_plugins_play to load vars for managed-node2 46400 1727204626.34710: done sending task result for task 0affcd87-79f5-1303-fda8-000000002336 46400 1727204626.34714: WORKER PROCESS EXITING 46400 1727204626.36006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204626.37825: done with get_vars() 46400 1727204626.37858: done getting variables 46400 1727204626.37923: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.074) 0:01:56.664 ***** 46400 1727204626.37969: entering _queue_task() for managed-node2/copy 46400 1727204626.38349: worker is 1 (out of 1 available) 46400 1727204626.38369: exiting _queue_task() for managed-node2/copy 46400 1727204626.38386: done queuing things up, now waiting for results queue to drain 46400 1727204626.38388: waiting for pending results... 
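The "Enable network service" task above was skipped because its condition network_provider == "initscripts" is False on this run (the provider is "nm"), and the copy task queued here, "Ensure initscripts network file dependency is present", is gated the same way. A sketch of that pattern, keeping the task names and condition from the log; the service name and the copied file are illustrative assumptions:

- name: Enable network service
  ansible.builtin.service:
    name: network                          # assumed legacy initscripts service name
    enabled: true
  when: network_provider == "initscripts"

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network           # assumed path, not shown in this log
    content: "# Created by the network system role\n"
    force: false
    mode: "0644"
  when: network_provider == "initscripts"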
46400 1727204626.38707: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204626.38890: in run() - task 0affcd87-79f5-1303-fda8-000000002337 46400 1727204626.38910: variable 'ansible_search_path' from source: unknown 46400 1727204626.38917: variable 'ansible_search_path' from source: unknown 46400 1727204626.38968: calling self._execute() 46400 1727204626.39076: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.39088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.39102: variable 'omit' from source: magic vars 46400 1727204626.39500: variable 'ansible_distribution_major_version' from source: facts 46400 1727204626.39517: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204626.39642: variable 'network_provider' from source: set_fact 46400 1727204626.39653: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204626.39665: when evaluation is False, skipping this task 46400 1727204626.39672: _execute() done 46400 1727204626.39679: dumping result to json 46400 1727204626.39686: done dumping result, returning 46400 1727204626.39701: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-000000002337] 46400 1727204626.39715: sending task result for task 0affcd87-79f5-1303-fda8-000000002337 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204626.39878: no more pending results, returning what we have 46400 1727204626.39882: results queue empty 46400 1727204626.39883: checking for any_errors_fatal 46400 1727204626.39891: done checking for any_errors_fatal 46400 1727204626.39891: checking for max_fail_percentage 46400 1727204626.39894: done checking for max_fail_percentage 46400 1727204626.39895: checking to see if all hosts have failed and the running result is not ok 46400 1727204626.39895: done checking to see if all hosts have failed 46400 1727204626.39896: getting the remaining hosts for this loop 46400 1727204626.39898: done getting the remaining hosts for this loop 46400 1727204626.39902: getting the next task for host managed-node2 46400 1727204626.39911: done getting next task for host managed-node2 46400 1727204626.39915: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204626.39921: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204626.39952: getting variables 46400 1727204626.39954: in VariableManager get_vars() 46400 1727204626.40007: Calling all_inventory to load vars for managed-node2 46400 1727204626.40011: Calling groups_inventory to load vars for managed-node2 46400 1727204626.40013: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204626.40026: Calling all_plugins_play to load vars for managed-node2 46400 1727204626.40029: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204626.40032: Calling groups_plugins_play to load vars for managed-node2 46400 1727204626.41005: done sending task result for task 0affcd87-79f5-1303-fda8-000000002337 46400 1727204626.41009: WORKER PROCESS EXITING 46400 1727204626.41949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204626.45213: done with get_vars() 46400 1727204626.45254: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.073) 0:01:56.738 ***** 46400 1727204626.45352: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204626.45731: worker is 1 (out of 1 available) 46400 1727204626.45743: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204626.45757: done queuing things up, now waiting for results queue to drain 46400 1727204626.45759: waiting for pending results... 
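The "Configure networking connection profiles" task queued here runs the collection's network_connections module; the following lines show network_connections arriving from include params, the interface play variable being templated into it, and the get_ansible_managed.j2 template being rendered for the managed-file header. A sketch of a play that drives this path through the role's public variables; only the variable names appear in the log, so the interface value and the profile settings below are illustrative assumptions:

- hosts: managed-node2
  vars:
    interface: eth0                        # placeholder; the real value is not shown in this log
    network_provider: nm
    network_connections:
      - name: "{{ interface }}"            # 'interface' is templated into network_connections in the log
        type: ethernet                     # assumed profile settings for illustration
        interface_name: "{{ interface }}"
        state: up
  roles:
    - fedora.linux_system_roles.network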
46400 1727204626.46326: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204626.46547: in run() - task 0affcd87-79f5-1303-fda8-000000002338 46400 1727204626.46576: variable 'ansible_search_path' from source: unknown 46400 1727204626.46588: variable 'ansible_search_path' from source: unknown 46400 1727204626.46695: calling self._execute() 46400 1727204626.46811: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.46823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.46839: variable 'omit' from source: magic vars 46400 1727204626.47302: variable 'ansible_distribution_major_version' from source: facts 46400 1727204626.47318: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204626.47328: variable 'omit' from source: magic vars 46400 1727204626.47409: variable 'omit' from source: magic vars 46400 1727204626.47599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204626.51516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204626.51600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204626.51662: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204626.51771: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204626.51845: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204626.51940: variable 'network_provider' from source: set_fact 46400 1727204626.52103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204626.52136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204626.52181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204626.52229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204626.52249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204626.52509: variable 'omit' from source: magic vars 46400 1727204626.52635: variable 'omit' from source: magic vars 46400 1727204626.52749: variable 'network_connections' from source: include params 46400 1727204626.52771: variable 'interface' from source: play vars 46400 1727204626.52841: variable 'interface' from source: play vars 46400 1727204626.53059: variable 'omit' from source: magic vars 46400 1727204626.53098: variable '__lsr_ansible_managed' from source: task vars 46400 1727204626.53172: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204626.53490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204626.53666: Loaded config def from plugin (lookup/template) 46400 1727204626.53669: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204626.53698: File lookup term: get_ansible_managed.j2 46400 1727204626.53702: variable 'ansible_search_path' from source: unknown 46400 1727204626.53705: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204626.53718: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204626.53738: variable 'ansible_search_path' from source: unknown 46400 1727204626.60912: variable 'ansible_managed' from source: unknown 46400 1727204626.61017: variable 'omit' from source: magic vars 46400 1727204626.61039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204626.61058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204626.61077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204626.61093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204626.61101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204626.61122: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204626.61125: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.61128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.61192: Set connection var ansible_shell_type to sh 46400 1727204626.61204: Set connection var ansible_shell_executable to /bin/sh 46400 1727204626.61207: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204626.61210: Set connection var ansible_connection to ssh 46400 1727204626.61214: Set connection var ansible_pipelining to False 46400 1727204626.61219: Set connection var ansible_timeout to 10 46400 1727204626.61475: variable 'ansible_shell_executable' from source: unknown 46400 1727204626.61479: variable 'ansible_connection' from source: unknown 46400 1727204626.61481: variable 'ansible_module_compression' 
from source: unknown 46400 1727204626.61483: variable 'ansible_shell_type' from source: unknown 46400 1727204626.61485: variable 'ansible_shell_executable' from source: unknown 46400 1727204626.61488: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204626.61490: variable 'ansible_pipelining' from source: unknown 46400 1727204626.61491: variable 'ansible_timeout' from source: unknown 46400 1727204626.61493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204626.61495: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204626.61506: variable 'omit' from source: magic vars 46400 1727204626.61508: starting attempt loop 46400 1727204626.61510: running the handler 46400 1727204626.61513: _low_level_execute_command(): starting 46400 1727204626.61515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204626.62119: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204626.62141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.62168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204626.62195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.62255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204626.62258: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204626.62609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204626.62613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.62655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204626.64307: stdout chunk (state=3): >>>/root <<< 46400 1727204626.64486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204626.64489: stdout chunk (state=3): >>><<< 46400 1727204626.64493: stderr chunk (state=3): >>><<< 46400 1727204626.64510: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204626.64520: _low_level_execute_command(): starting 46400 1727204626.64526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709 `" && echo ansible-tmp-1727204626.6450992-54443-221734045509709="` echo /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709 `" ) && sleep 0' 46400 1727204626.64988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.64994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.65039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.65042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204626.65044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.65097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204626.65103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.65151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204626.66988: stdout chunk (state=3): >>>ansible-tmp-1727204626.6450992-54443-221734045509709=/root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709 <<< 46400 1727204626.67107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204626.67157: stderr chunk (state=3): >>><<< 46400 1727204626.67163: stdout chunk (state=3): >>><<< 46400 1727204626.67180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204626.6450992-54443-221734045509709=/root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204626.67219: variable 'ansible_module_compression' from source: unknown 46400 1727204626.67269: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204626.67302: variable 'ansible_facts' from source: unknown 46400 1727204626.67390: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/AnsiballZ_network_connections.py 46400 1727204626.67502: Sending initial data 46400 1727204626.67505: Sent initial data (168 bytes) 46400 1727204626.68196: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.68201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.68231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204626.68243: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204626.68250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.68266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204626.68269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204626.68277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204626.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.68287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204626.68297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.68304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204626.68309: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204626.68313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.68378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204626.68384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204626.68387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.68443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204626.70141: stderr chunk (state=3): 
>>>debug2: Remote version: 3 <<< 46400 1727204626.70148: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204626.70155: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 46400 1727204626.70165: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 46400 1727204626.70169: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 46400 1727204626.70188: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 46400 1727204626.70190: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204626.70217: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 46400 1727204626.70223: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 46400 1727204626.70228: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 46400 1727204626.70278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpxxgoo9xd /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/AnsiballZ_network_connections.py <<< 46400 1727204626.70310: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204626.71458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204626.71590: stderr chunk (state=3): >>><<< 46400 1727204626.71593: stdout chunk (state=3): >>><<< 46400 1727204626.71615: done transferring module to remote 46400 1727204626.71624: _low_level_execute_command(): starting 46400 1727204626.71629: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/ /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/AnsiballZ_network_connections.py && sleep 0' 46400 1727204626.72098: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.72104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.72138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204626.72145: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.72175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204626.72178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204626.72181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.72226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 
1727204626.72229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.72286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204626.73994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204626.74053: stderr chunk (state=3): >>><<< 46400 1727204626.74058: stdout chunk (state=3): >>><<< 46400 1727204626.74078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204626.74081: _low_level_execute_command(): starting 46400 1727204626.74084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/AnsiballZ_network_connections.py && sleep 0' 46400 1727204626.74545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204626.74549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.74586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204626.74601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.74618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.74662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204626.74678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.74731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204626.96721: stdout chunk (state=3): >>> {"changed": false, "warnings": 
[], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204626.98169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204626.98217: stderr chunk (state=3): >>><<< 46400 1727204626.98220: stdout chunk (state=3): >>><<< 46400 1727204626.98241: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204626.98275: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204626.98283: _low_level_execute_command(): starting 46400 1727204626.98287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204626.6450992-54443-221734045509709/ > /dev/null 2>&1 && sleep 0' 46400 1727204626.98734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204626.98737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204626.98771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.98774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204626.98778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204626.98792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204626.98829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204626.98841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204626.98894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.00849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.00853: stdout chunk (state=3): >>><<< 46400 1727204627.00855: stderr chunk (state=3): >>><<< 46400 1727204627.00858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204627.00863: handler run complete 46400 1727204627.00868: attempt loop complete, returning result 46400 1727204627.00870: _execute() done 46400 1727204627.00872: dumping result to json 46400 1727204627.00873: done dumping result, returning 46400 1727204627.00957: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-000000002338] 46400 1727204627.00970: sending task result for task 0affcd87-79f5-1303-fda8-000000002338 46400 1727204627.01038: done sending task result for task 0affcd87-79f5-1303-fda8-000000002338 46400 1727204627.01041: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active 46400 1727204627.01166: no more pending results, returning what we have 46400 1727204627.01171: results queue empty 46400 1727204627.01172: checking for any_errors_fatal 46400 1727204627.01179: done checking for any_errors_fatal 46400 1727204627.01180: checking for max_fail_percentage 46400 1727204627.01182: done checking for max_fail_percentage 46400 1727204627.01182: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.01183: done checking to see if all hosts have failed 46400 1727204627.01184: getting the remaining hosts for this loop 46400 1727204627.01185: done getting the remaining hosts for this loop 46400 1727204627.01189: getting the next task for host managed-node2 46400 1727204627.01197: done getting next task for host managed-node2 46400 1727204627.01201: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204627.01206: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.01218: getting variables 46400 1727204627.01219: in VariableManager get_vars() 46400 1727204627.01265: Calling all_inventory to load vars for managed-node2 46400 1727204627.01268: Calling groups_inventory to load vars for managed-node2 46400 1727204627.01271: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.01280: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.01283: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.01285: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.02690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.03617: done with get_vars() 46400 1727204627.03634: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.583) 0:01:57.321 ***** 46400 1727204627.03705: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204627.03953: worker is 1 (out of 1 available) 46400 1727204627.03970: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204627.03983: done queuing things up, now waiting for results queue to drain 46400 1727204627.03985: waiting for pending results... 46400 1727204627.04209: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204627.04372: in run() - task 0affcd87-79f5-1303-fda8-000000002339 46400 1727204627.04397: variable 'ansible_search_path' from source: unknown 46400 1727204627.04405: variable 'ansible_search_path' from source: unknown 46400 1727204627.04447: calling self._execute() 46400 1727204627.04562: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.04579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.04594: variable 'omit' from source: magic vars 46400 1727204627.04994: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.05012: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.05149: variable 'network_state' from source: role '' defaults 46400 1727204627.05170: Evaluated conditional (network_state != {}): False 46400 1727204627.05179: when evaluation is False, skipping this task 46400 1727204627.05187: _execute() done 46400 1727204627.05195: dumping result to json 46400 1727204627.05203: done dumping result, returning 46400 1727204627.05213: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-000000002339] 46400 1727204627.05225: sending task result for task 0affcd87-79f5-1303-fda8-000000002339 46400 1727204627.05352: done sending task result for task 0affcd87-79f5-1303-fda8-000000002339 46400 1727204627.05357: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 
1727204627.05410: no more pending results, returning what we have 46400 1727204627.05415: results queue empty 46400 1727204627.05416: checking for any_errors_fatal 46400 1727204627.05442: done checking for any_errors_fatal 46400 1727204627.05444: checking for max_fail_percentage 46400 1727204627.05447: done checking for max_fail_percentage 46400 1727204627.05448: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.05449: done checking to see if all hosts have failed 46400 1727204627.05449: getting the remaining hosts for this loop 46400 1727204627.05451: done getting the remaining hosts for this loop 46400 1727204627.05455: getting the next task for host managed-node2 46400 1727204627.05490: done getting next task for host managed-node2 46400 1727204627.05494: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204627.05501: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204627.05528: getting variables 46400 1727204627.05529: in VariableManager get_vars() 46400 1727204627.05581: Calling all_inventory to load vars for managed-node2 46400 1727204627.05585: Calling groups_inventory to load vars for managed-node2 46400 1727204627.05587: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.05598: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.05600: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.05602: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.06453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.07484: done with get_vars() 46400 1727204627.07501: done getting variables 46400 1727204627.07544: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.038) 0:01:57.360 ***** 46400 1727204627.07574: entering _queue_task() for managed-node2/debug 46400 1727204627.07817: worker is 1 (out of 1 available) 46400 1727204627.07831: exiting _queue_task() for managed-node2/debug 46400 1727204627.07844: done queuing things up, now waiting for results queue to drain 46400 1727204627.07845: waiting for pending results... 46400 1727204627.08040: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204627.08133: in run() - task 0affcd87-79f5-1303-fda8-00000000233a 46400 1727204627.08145: variable 'ansible_search_path' from source: unknown 46400 1727204627.08148: variable 'ansible_search_path' from source: unknown 46400 1727204627.08180: calling self._execute() 46400 1727204627.08255: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.08259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.08277: variable 'omit' from source: magic vars 46400 1727204627.08558: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.08570: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.08576: variable 'omit' from source: magic vars 46400 1727204627.08622: variable 'omit' from source: magic vars 46400 1727204627.08647: variable 'omit' from source: magic vars 46400 1727204627.08682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204627.08711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204627.08731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204627.08744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.08753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.08779: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204627.08783: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.08786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.08858: Set connection var ansible_shell_type to sh 46400 1727204627.08868: Set connection var ansible_shell_executable to /bin/sh 46400 1727204627.08873: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204627.08878: Set connection var ansible_connection to ssh 46400 1727204627.08883: Set connection var ansible_pipelining to False 46400 1727204627.08888: Set connection var ansible_timeout to 10 46400 1727204627.08907: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.08909: variable 'ansible_connection' from source: unknown 46400 1727204627.08912: variable 'ansible_module_compression' from source: unknown 46400 1727204627.08914: variable 'ansible_shell_type' from source: unknown 46400 1727204627.08917: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.08921: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.08923: variable 'ansible_pipelining' from source: unknown 46400 1727204627.08925: variable 'ansible_timeout' from source: unknown 46400 1727204627.08929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.09032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204627.09044: variable 'omit' from source: magic vars 46400 1727204627.09053: starting attempt loop 46400 1727204627.09056: running the handler 46400 1727204627.09156: variable '__network_connections_result' from source: set_fact 46400 1727204627.09203: handler run complete 46400 1727204627.09215: attempt loop complete, returning result 46400 1727204627.09218: _execute() done 46400 1727204627.09221: dumping result to json 46400 1727204627.09224: done dumping result, returning 46400 1727204627.09230: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-00000000233a] 46400 1727204627.09235: sending task result for task 0affcd87-79f5-1303-fda8-00000000233a 46400 1727204627.09318: done sending task result for task 0affcd87-79f5-1303-fda8-00000000233a 46400 1727204627.09321: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active" ] } 46400 1727204627.09394: no more pending results, returning what we have 46400 1727204627.09398: results queue empty 46400 1727204627.09399: checking for any_errors_fatal 46400 1727204627.09406: done checking for any_errors_fatal 46400 1727204627.09407: checking for max_fail_percentage 46400 1727204627.09409: done checking for max_fail_percentage 46400 1727204627.09410: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.09411: done checking to see if all hosts have failed 46400 1727204627.09411: getting the remaining hosts for this loop 46400 1727204627.09413: done getting the remaining hosts for this loop 
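Annotation: the result just printed matches what a debug task pointed at the registered result's stderr_lines would emit. A rough sketch of a task with that effect, assuming the shape of the task at roles/network/tasks/main.yml:177 (its exact body is not reproduced in this log):

    # Sketch only; the real task in the collection may differ in wording.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
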
46400 1727204627.09417: getting the next task for host managed-node2 46400 1727204627.09424: done getting next task for host managed-node2 46400 1727204627.09428: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204627.09434: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.09447: getting variables 46400 1727204627.09449: in VariableManager get_vars() 46400 1727204627.09495: Calling all_inventory to load vars for managed-node2 46400 1727204627.09498: Calling groups_inventory to load vars for managed-node2 46400 1727204627.09500: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.09510: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.09512: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.09515: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.10322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.11229: done with get_vars() 46400 1727204627.11248: done getting variables 46400 1727204627.11295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.037) 0:01:57.397 ***** 46400 1727204627.11328: entering _queue_task() for managed-node2/debug 46400 1727204627.11568: worker is 1 (out of 1 available) 46400 1727204627.11582: exiting _queue_task() for managed-node2/debug 46400 1727204627.11594: done queuing things up, now waiting for results queue to drain 46400 1727204627.11596: waiting for pending results... 
46400 1727204627.11796: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204627.11897: in run() - task 0affcd87-79f5-1303-fda8-00000000233b 46400 1727204627.11909: variable 'ansible_search_path' from source: unknown 46400 1727204627.11912: variable 'ansible_search_path' from source: unknown 46400 1727204627.11941: calling self._execute() 46400 1727204627.12016: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.12021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.12029: variable 'omit' from source: magic vars 46400 1727204627.12312: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.12321: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.12327: variable 'omit' from source: magic vars 46400 1727204627.12380: variable 'omit' from source: magic vars 46400 1727204627.12409: variable 'omit' from source: magic vars 46400 1727204627.12441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204627.12473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204627.12491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204627.12504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.12514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.12541: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204627.12544: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.12546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.12619: Set connection var ansible_shell_type to sh 46400 1727204627.12632: Set connection var ansible_shell_executable to /bin/sh 46400 1727204627.12636: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204627.12641: Set connection var ansible_connection to ssh 46400 1727204627.12646: Set connection var ansible_pipelining to False 46400 1727204627.12651: Set connection var ansible_timeout to 10 46400 1727204627.12676: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.12680: variable 'ansible_connection' from source: unknown 46400 1727204627.12683: variable 'ansible_module_compression' from source: unknown 46400 1727204627.12685: variable 'ansible_shell_type' from source: unknown 46400 1727204627.12687: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.12690: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.12693: variable 'ansible_pipelining' from source: unknown 46400 1727204627.12696: variable 'ansible_timeout' from source: unknown 46400 1727204627.12700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.12809: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204627.12820: variable 'omit' from source: magic vars 46400 1727204627.12825: starting attempt loop 46400 1727204627.12827: running the handler 46400 1727204627.12873: variable '__network_connections_result' from source: set_fact 46400 1727204627.12926: variable '__network_connections_result' from source: set_fact 46400 1727204627.13005: handler run complete 46400 1727204627.13022: attempt loop complete, returning result 46400 1727204627.13025: _execute() done 46400 1727204627.13029: dumping result to json 46400 1727204627.13032: done dumping result, returning 46400 1727204627.13037: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-00000000233b] 46400 1727204627.13043: sending task result for task 0affcd87-79f5-1303-fda8-00000000233b 46400 1727204627.13129: done sending task result for task 0affcd87-79f5-1303-fda8-00000000233b 46400 1727204627.13132: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e77de78e-51aa-4006-a80a-c43a9ef40807 skipped because already active" ] } } 46400 1727204627.13232: no more pending results, returning what we have 46400 1727204627.13236: results queue empty 46400 1727204627.13237: checking for any_errors_fatal 46400 1727204627.13243: done checking for any_errors_fatal 46400 1727204627.13244: checking for max_fail_percentage 46400 1727204627.13245: done checking for max_fail_percentage 46400 1727204627.13246: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.13247: done checking to see if all hosts have failed 46400 1727204627.13247: getting the remaining hosts for this loop 46400 1727204627.13249: done getting the remaining hosts for this loop 46400 1727204627.13252: getting the next task for host managed-node2 46400 1727204627.13262: done getting next task for host managed-node2 46400 1727204627.13268: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204627.13272: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.13291: getting variables 46400 1727204627.13293: in VariableManager get_vars() 46400 1727204627.13330: Calling all_inventory to load vars for managed-node2 46400 1727204627.13332: Calling groups_inventory to load vars for managed-node2 46400 1727204627.13339: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.13348: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.13350: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.13353: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.18886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.19823: done with get_vars() 46400 1727204627.19844: done getting variables 46400 1727204627.19884: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.085) 0:01:57.483 ***** 46400 1727204627.19906: entering _queue_task() for managed-node2/debug 46400 1727204627.20158: worker is 1 (out of 1 available) 46400 1727204627.20178: exiting _queue_task() for managed-node2/debug 46400 1727204627.20192: done queuing things up, now waiting for results queue to drain 46400 1727204627.20194: waiting for pending results... 
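Annotation: the task about to run below is skipped because the role default network_state is an empty dict, so "network_state != {}" evaluates to False (the same pattern seen earlier for "Configure networking state"). A rough sketch of a task gated this way; the exact body in the collection is not reproduced in this log:

    # Sketch only; what the real task prints when network_state is non-empty may differ.
    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: network_state
      when: network_state != {}
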
46400 1727204627.20386: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204627.20503: in run() - task 0affcd87-79f5-1303-fda8-00000000233c 46400 1727204627.20513: variable 'ansible_search_path' from source: unknown 46400 1727204627.20518: variable 'ansible_search_path' from source: unknown 46400 1727204627.20548: calling self._execute() 46400 1727204627.20625: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.20629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.20639: variable 'omit' from source: magic vars 46400 1727204627.21049: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.21073: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.21234: variable 'network_state' from source: role '' defaults 46400 1727204627.21250: Evaluated conditional (network_state != {}): False 46400 1727204627.21258: when evaluation is False, skipping this task 46400 1727204627.21271: _execute() done 46400 1727204627.21278: dumping result to json 46400 1727204627.21283: done dumping result, returning 46400 1727204627.21292: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-00000000233c] 46400 1727204627.21302: sending task result for task 0affcd87-79f5-1303-fda8-00000000233c 46400 1727204627.21434: done sending task result for task 0affcd87-79f5-1303-fda8-00000000233c skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204627.21494: no more pending results, returning what we have 46400 1727204627.21498: results queue empty 46400 1727204627.21500: checking for any_errors_fatal 46400 1727204627.21510: done checking for any_errors_fatal 46400 1727204627.21511: checking for max_fail_percentage 46400 1727204627.21513: done checking for max_fail_percentage 46400 1727204627.21514: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.21515: done checking to see if all hosts have failed 46400 1727204627.21516: getting the remaining hosts for this loop 46400 1727204627.21518: done getting the remaining hosts for this loop 46400 1727204627.21522: getting the next task for host managed-node2 46400 1727204627.21532: done getting next task for host managed-node2 46400 1727204627.21537: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204627.21543: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.21581: getting variables 46400 1727204627.21583: in VariableManager get_vars() 46400 1727204627.21634: Calling all_inventory to load vars for managed-node2 46400 1727204627.21637: Calling groups_inventory to load vars for managed-node2 46400 1727204627.21641: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.21654: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.21658: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.21673: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.22507: WORKER PROCESS EXITING 46400 1727204627.22845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.23796: done with get_vars() 46400 1727204627.23815: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.039) 0:01:57.523 ***** 46400 1727204627.23893: entering _queue_task() for managed-node2/ping 46400 1727204627.24227: worker is 1 (out of 1 available) 46400 1727204627.24241: exiting _queue_task() for managed-node2/ping 46400 1727204627.24253: done queuing things up, now waiting for results queue to drain 46400 1727204627.24255: waiting for pending results... 46400 1727204627.24586: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204627.24746: in run() - task 0affcd87-79f5-1303-fda8-00000000233d 46400 1727204627.24759: variable 'ansible_search_path' from source: unknown 46400 1727204627.24763: variable 'ansible_search_path' from source: unknown 46400 1727204627.24804: calling self._execute() 46400 1727204627.24913: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.24918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.24929: variable 'omit' from source: magic vars 46400 1727204627.25317: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.25329: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.25335: variable 'omit' from source: magic vars 46400 1727204627.25410: variable 'omit' from source: magic vars 46400 1727204627.25442: variable 'omit' from source: magic vars 46400 1727204627.25491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204627.25526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204627.25547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204627.25567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.25582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204627.25621: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204627.25624: variable 'ansible_host' from source: 
host vars for 'managed-node2' 46400 1727204627.25628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.25731: Set connection var ansible_shell_type to sh 46400 1727204627.25740: Set connection var ansible_shell_executable to /bin/sh 46400 1727204627.25745: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204627.25751: Set connection var ansible_connection to ssh 46400 1727204627.25756: Set connection var ansible_pipelining to False 46400 1727204627.25768: Set connection var ansible_timeout to 10 46400 1727204627.25792: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.25796: variable 'ansible_connection' from source: unknown 46400 1727204627.25799: variable 'ansible_module_compression' from source: unknown 46400 1727204627.25808: variable 'ansible_shell_type' from source: unknown 46400 1727204627.25811: variable 'ansible_shell_executable' from source: unknown 46400 1727204627.25813: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.25816: variable 'ansible_pipelining' from source: unknown 46400 1727204627.25818: variable 'ansible_timeout' from source: unknown 46400 1727204627.25823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.26030: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204627.26041: variable 'omit' from source: magic vars 46400 1727204627.26044: starting attempt loop 46400 1727204627.26047: running the handler 46400 1727204627.26066: _low_level_execute_command(): starting 46400 1727204627.26073: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204627.26849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204627.26862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.26894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.26938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.26946: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204627.26956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.26974: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204627.26983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204627.26989: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204627.26997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.27012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.27024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.27032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.27039: stderr chunk (state=3): >>>debug2: match found <<< 
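Annotation: the surrounding trace is the "Re-test connectivity" step, where Ansible repeats its usual bootstrap over the multiplexed SSH session (echo ~ to find the remote home, a fresh temp directory, then the AnsiballZ_ping.py transfer) before running the ping module to confirm the node is still reachable and can execute Python after the network changes. A minimal sketch of such a task; the exact body at roles/network/tasks/main.yml:192 is not shown in this log:

    # Sketch only; module name taken from "entering _queue_task() for managed-node2/ping" above.
    - name: Re-test connectivity
      ansible.builtin.ping:
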
46400 1727204627.27049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.27127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.27148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204627.27160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.27235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.28893: stdout chunk (state=3): >>>/root <<< 46400 1727204627.29077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.29081: stdout chunk (state=3): >>><<< 46400 1727204627.29091: stderr chunk (state=3): >>><<< 46400 1727204627.29121: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204627.29133: _low_level_execute_command(): starting 46400 1727204627.29140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867 `" && echo ansible-tmp-1727204627.2911773-54546-281051898763867="` echo /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867 `" ) && sleep 0' 46400 1727204627.29840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204627.29850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.29860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.29884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.29923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.29930: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204627.29941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.29953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204627.29961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204627.29974: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 46400 1727204627.29981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.29991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.30006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.30013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.30020: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204627.30029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.30108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.30126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204627.30136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.30205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.32067: stdout chunk (state=3): >>>ansible-tmp-1727204627.2911773-54546-281051898763867=/root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867 <<< 46400 1727204627.32278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.32282: stdout chunk (state=3): >>><<< 46400 1727204627.32284: stderr chunk (state=3): >>><<< 46400 1727204627.32437: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204627.2911773-54546-281051898763867=/root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204627.32441: variable 'ansible_module_compression' from source: unknown 46400 1727204627.32444: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204627.32446: variable 'ansible_facts' from source: unknown 46400 1727204627.32503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/AnsiballZ_ping.py 46400 1727204627.32647: Sending initial data 46400 1727204627.32651: Sent initial data (153 bytes) 46400 1727204627.33813: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204627.33825: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204627.33837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.33849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.33892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.33900: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204627.33910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.33925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204627.33936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204627.33943: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204627.33954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.33960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.33978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.33985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.33992: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204627.34002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.34080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.34098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204627.34110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.34183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.35971: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204627.35986: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204627.36031: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpeil0l635 /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/AnsiballZ_ping.py <<< 46400 1727204627.36070: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204627.37419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.37423: stderr chunk (state=3): >>><<< 46400 1727204627.37426: stdout chunk (state=3): >>><<< 46400 1727204627.37428: done transferring module to remote 46400 1727204627.37430: _low_level_execute_command(): starting 46400 1727204627.37437: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/ /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/AnsiballZ_ping.py && sleep 0' 46400 1727204627.38421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204627.38433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.38443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.38457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.38502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.38508: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204627.38519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.38535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204627.38544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204627.38551: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204627.38559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.38575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.38586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.38594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.38600: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204627.38612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.38688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.38703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204627.38707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.38877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.40552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.40608: stderr chunk (state=3): >>><<< 46400 1727204627.40612: stdout chunk (state=3): >>><<< 46400 1727204627.40623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204627.40628: _low_level_execute_command(): starting 46400 1727204627.40630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/AnsiballZ_ping.py && sleep 0' 46400 1727204627.41107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.41112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.41168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.41172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204627.41176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.41219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.41224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.41280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.54157: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204627.55220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204627.55226: stdout chunk (state=3): >>><<< 46400 1727204627.55229: stderr chunk (state=3): >>><<< 46400 1727204627.55252: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204627.55281: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204627.55291: _low_level_execute_command(): starting 46400 1727204627.55294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204627.2911773-54546-281051898763867/ > /dev/null 2>&1 && sleep 0' 46400 1727204627.55948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204627.55958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.55971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.55986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.56025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.56033: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204627.56044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.56057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204627.56073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204627.56079: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204627.56087: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204627.56096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204627.56108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204627.56115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204627.56122: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204627.56131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204627.56216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204627.56223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204627.56231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204627.56304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204627.58100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204627.58204: stderr chunk (state=3): >>><<< 46400 1727204627.58216: stdout chunk (state=3): >>><<< 46400 1727204627.58476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204627.58479: handler run complete 46400 1727204627.58482: attempt loop complete, returning result 46400 1727204627.58484: _execute() done 46400 1727204627.58486: dumping result to json 46400 1727204627.58488: done dumping result, returning 46400 1727204627.58490: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-00000000233d] 46400 1727204627.58492: sending task result for task 0affcd87-79f5-1303-fda8-00000000233d 46400 1727204627.58561: done sending task result for task 0affcd87-79f5-1303-fda8-00000000233d 46400 1727204627.58567: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204627.58648: no more pending results, returning what we have 46400 1727204627.58653: results queue empty 46400 1727204627.58655: checking for any_errors_fatal 46400 1727204627.58661: done checking for any_errors_fatal 46400 1727204627.58662: checking for max_fail_percentage 46400 1727204627.58665: done checking for max_fail_percentage 46400 
1727204627.58666: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.58667: done checking to see if all hosts have failed 46400 1727204627.58668: getting the remaining hosts for this loop 46400 1727204627.58670: done getting the remaining hosts for this loop 46400 1727204627.58674: getting the next task for host managed-node2 46400 1727204627.58687: done getting next task for host managed-node2 46400 1727204627.58689: ^ task is: TASK: meta (role_complete) 46400 1727204627.58695: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.58710: getting variables 46400 1727204627.58712: in VariableManager get_vars() 46400 1727204627.58767: Calling all_inventory to load vars for managed-node2 46400 1727204627.58770: Calling groups_inventory to load vars for managed-node2 46400 1727204627.58773: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.58784: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.58787: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.58790: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.60743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.62565: done with get_vars() 46400 1727204627.62600: done getting variables 46400 1727204627.62698: done queuing things up, now waiting for results queue to drain 46400 1727204627.62701: results queue empty 46400 1727204627.62701: checking for any_errors_fatal 46400 1727204627.62704: done checking for any_errors_fatal 46400 1727204627.62706: checking for max_fail_percentage 46400 1727204627.62707: done checking for max_fail_percentage 46400 1727204627.62707: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.62708: done checking to see if all hosts have failed 46400 1727204627.62709: getting the remaining hosts for this loop 46400 1727204627.62710: done getting the remaining hosts for this loop 46400 1727204627.62717: getting the next task for host managed-node2 46400 1727204627.62724: done getting next task for host managed-node2 46400 1727204627.62727: ^ task is: TASK: Include network role 46400 1727204627.62729: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.62732: getting variables 46400 1727204627.62733: in VariableManager get_vars() 46400 1727204627.62747: Calling all_inventory to load vars for managed-node2 46400 1727204627.62749: Calling groups_inventory to load vars for managed-node2 46400 1727204627.62751: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.62756: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.62758: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.62760: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.64101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.65905: done with get_vars() 46400 1727204627.65947: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.421) 0:01:57.945 ***** 46400 1727204627.66046: entering _queue_task() for managed-node2/include_role 46400 1727204627.66440: worker is 1 (out of 1 available) 46400 1727204627.66458: exiting _queue_task() for managed-node2/include_role 46400 1727204627.66476: done queuing things up, now waiting for results queue to drain 46400 1727204627.66478: waiting for pending results... 
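
Taken together, the ping re-test entries traced above cover the full remote lifecycle of one module invocation: create a per-task temp directory under /root/.ansible/tmp with umask 77, push AnsiballZ_ping.py over the existing SFTP channel, chmod it, run it with the remote /usr/bin/python3.9, read the {"ping": "pong"} JSON from stdout, and finally rm -f -r the temp directory. The sketch below strings those same steps together with plain ssh/sftp subprocesses purely for illustration; the function names, the batch-mode sftp usage, and the hard-coded interpreter path are assumptions, not Ansible's implementation.

import json
import os
import random
import shlex
import subprocess
import time

def ssh(host, cmd):
    # Same "/bin/sh -c '...'" wrapper as the log; returns (rc, stdout).
    proc = subprocess.run(["ssh", host, "/bin/sh -c " + shlex.quote(cmd)],
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout

def run_module(host, local_module, interpreter="/usr/bin/python3.9"):
    # 1. create a uniquely named temp dir (name shape copied from the log entries)
    base = "~/.ansible/tmp"
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
    rc, out = ssh(host, "( umask 77 && mkdir -p %s && mkdir %s/%s && echo %s/%s ) && sleep 0"
                  % (base, base, name, base, name))
    tmpdir = out.strip()

    # 2. transfer the module with the stock OpenSSH sftp client in batch mode,
    #    mirroring the "sftp> put ..." line in the log
    subprocess.run(["sftp", "-b", "-", host], text=True,
                   input="put %s %s/AnsiballZ_ping.py\n" % (local_module, tmpdir))

    # 3. make it executable, run it, and capture the module's JSON result
    ssh(host, "chmod u+x %s/ %s/AnsiballZ_ping.py" % (tmpdir, tmpdir))
    rc, out = ssh(host, "%s %s/AnsiballZ_ping.py" % (interpreter, tmpdir))
    result = json.loads(out)          # e.g. {"ping": "pong", "invocation": {...}}

    # 4. clean up the temp directory, as in the final rm -f -r command above
    ssh(host, "rm -f -r %s/ > /dev/null 2>&1" % tmpdir)
    return result

Ansible layers ControlPersist multiplexing, pipelining and ANSIBALLZ caching on top of this flow, which is why the log shows "auto-mux: Trying existing master" and "ANSIBALLZ: using cached module" instead of fresh connections and module rebuilds.
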
46400 1727204627.66795: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204627.66966: in run() - task 0affcd87-79f5-1303-fda8-000000002142 46400 1727204627.66986: variable 'ansible_search_path' from source: unknown 46400 1727204627.66993: variable 'ansible_search_path' from source: unknown 46400 1727204627.67043: calling self._execute() 46400 1727204627.67159: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.67173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.67190: variable 'omit' from source: magic vars 46400 1727204627.67622: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.67640: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.67650: _execute() done 46400 1727204627.67658: dumping result to json 46400 1727204627.67668: done dumping result, returning 46400 1727204627.67686: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000002142] 46400 1727204627.67698: sending task result for task 0affcd87-79f5-1303-fda8-000000002142 46400 1727204627.67868: no more pending results, returning what we have 46400 1727204627.67874: in VariableManager get_vars() 46400 1727204627.67930: Calling all_inventory to load vars for managed-node2 46400 1727204627.67933: Calling groups_inventory to load vars for managed-node2 46400 1727204627.67938: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.67955: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.67959: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.67963: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.69017: done sending task result for task 0affcd87-79f5-1303-fda8-000000002142 46400 1727204627.69021: WORKER PROCESS EXITING 46400 1727204627.70004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.71724: done with get_vars() 46400 1727204627.71755: variable 'ansible_search_path' from source: unknown 46400 1727204627.71757: variable 'ansible_search_path' from source: unknown 46400 1727204627.71926: variable 'omit' from source: magic vars 46400 1727204627.71973: variable 'omit' from source: magic vars 46400 1727204627.71993: variable 'omit' from source: magic vars 46400 1727204627.71997: we have included files to process 46400 1727204627.71998: generating all_blocks data 46400 1727204627.72000: done generating all_blocks data 46400 1727204627.72005: processing included file: fedora.linux_system_roles.network 46400 1727204627.72026: in VariableManager get_vars() 46400 1727204627.72044: done with get_vars() 46400 1727204627.72079: in VariableManager get_vars() 46400 1727204627.72102: done with get_vars() 46400 1727204627.72140: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204627.72275: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204627.72359: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204627.72872: in VariableManager get_vars() 46400 1727204627.72896: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204627.74850: iterating over new_blocks loaded from 
include file 46400 1727204627.74852: in VariableManager get_vars() 46400 1727204627.74870: done with get_vars() 46400 1727204627.74872: filtering new block on tags 46400 1727204627.75045: done filtering new block on tags 46400 1727204627.75048: in VariableManager get_vars() 46400 1727204627.75059: done with get_vars() 46400 1727204627.75062: filtering new block on tags 46400 1727204627.75074: done filtering new block on tags 46400 1727204627.75076: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204627.75080: extending task lists for all hosts with included blocks 46400 1727204627.75150: done extending task lists 46400 1727204627.75151: done processing included files 46400 1727204627.75152: results queue empty 46400 1727204627.75152: checking for any_errors_fatal 46400 1727204627.75153: done checking for any_errors_fatal 46400 1727204627.75154: checking for max_fail_percentage 46400 1727204627.75155: done checking for max_fail_percentage 46400 1727204627.75155: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.75156: done checking to see if all hosts have failed 46400 1727204627.75156: getting the remaining hosts for this loop 46400 1727204627.75157: done getting the remaining hosts for this loop 46400 1727204627.75158: getting the next task for host managed-node2 46400 1727204627.75165: done getting next task for host managed-node2 46400 1727204627.75167: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204627.75169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204627.75177: getting variables 46400 1727204627.75177: in VariableManager get_vars() 46400 1727204627.75187: Calling all_inventory to load vars for managed-node2 46400 1727204627.75188: Calling groups_inventory to load vars for managed-node2 46400 1727204627.75190: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.75193: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.75195: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.75196: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.75943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.77492: done with get_vars() 46400 1727204627.77523: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.115) 0:01:58.060 ***** 46400 1727204627.77620: entering _queue_task() for managed-node2/include_tasks 46400 1727204627.77999: worker is 1 (out of 1 available) 46400 1727204627.78012: exiting _queue_task() for managed-node2/include_tasks 46400 1727204627.78025: done queuing things up, now waiting for results queue to drain 46400 1727204627.78026: waiting for pending results... 46400 1727204627.78362: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204627.78530: in run() - task 0affcd87-79f5-1303-fda8-0000000024a4 46400 1727204627.78534: variable 'ansible_search_path' from source: unknown 46400 1727204627.78537: variable 'ansible_search_path' from source: unknown 46400 1727204627.78541: calling self._execute() 46400 1727204627.78644: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.78648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.78658: variable 'omit' from source: magic vars 46400 1727204627.79024: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.79028: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.79177: _execute() done 46400 1727204627.79181: dumping result to json 46400 1727204627.79183: done dumping result, returning 46400 1727204627.79185: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-0000000024a4] 46400 1727204627.79187: sending task result for task 0affcd87-79f5-1303-fda8-0000000024a4 46400 1727204627.79258: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a4 46400 1727204627.79265: WORKER PROCESS EXITING 46400 1727204627.79322: no more pending results, returning what we have 46400 1727204627.79327: in VariableManager get_vars() 46400 1727204627.79383: Calling all_inventory to load vars for managed-node2 46400 1727204627.79386: Calling groups_inventory to load vars for managed-node2 46400 1727204627.79388: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.79399: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.79402: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.79405: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.80707: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.81783: done with get_vars() 46400 1727204627.81798: variable 'ansible_search_path' from source: unknown 46400 1727204627.81800: variable 'ansible_search_path' from source: unknown 46400 1727204627.81829: we have included files to process 46400 1727204627.81830: generating all_blocks data 46400 1727204627.81831: done generating all_blocks data 46400 1727204627.81833: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204627.81834: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204627.81835: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204627.82339: done processing included file 46400 1727204627.82341: iterating over new_blocks loaded from include file 46400 1727204627.82343: in VariableManager get_vars() 46400 1727204627.82377: done with get_vars() 46400 1727204627.82380: filtering new block on tags 46400 1727204627.82413: done filtering new block on tags 46400 1727204627.82416: in VariableManager get_vars() 46400 1727204627.82442: done with get_vars() 46400 1727204627.82445: filtering new block on tags 46400 1727204627.82497: done filtering new block on tags 46400 1727204627.82500: in VariableManager get_vars() 46400 1727204627.82526: done with get_vars() 46400 1727204627.82528: filtering new block on tags 46400 1727204627.82580: done filtering new block on tags 46400 1727204627.82582: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204627.82587: extending task lists for all hosts with included blocks 46400 1727204627.84014: done extending task lists 46400 1727204627.84015: done processing included files 46400 1727204627.84015: results queue empty 46400 1727204627.84016: checking for any_errors_fatal 46400 1727204627.84018: done checking for any_errors_fatal 46400 1727204627.84019: checking for max_fail_percentage 46400 1727204627.84020: done checking for max_fail_percentage 46400 1727204627.84020: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.84021: done checking to see if all hosts have failed 46400 1727204627.84021: getting the remaining hosts for this loop 46400 1727204627.84022: done getting the remaining hosts for this loop 46400 1727204627.84024: getting the next task for host managed-node2 46400 1727204627.84027: done getting next task for host managed-node2 46400 1727204627.84029: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204627.84032: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.84043: getting variables 46400 1727204627.84044: in VariableManager get_vars() 46400 1727204627.84054: Calling all_inventory to load vars for managed-node2 46400 1727204627.84056: Calling groups_inventory to load vars for managed-node2 46400 1727204627.84057: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.84062: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.84065: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.84067: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.84743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.85657: done with get_vars() 46400 1727204627.85676: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.081) 0:01:58.141 ***** 46400 1727204627.85734: entering _queue_task() for managed-node2/setup 46400 1727204627.85999: worker is 1 (out of 1 available) 46400 1727204627.86013: exiting _queue_task() for managed-node2/setup 46400 1727204627.86026: done queuing things up, now waiting for results queue to drain 46400 1727204627.86028: waiting for pending results... 
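
Both the "Include network role" task and the role's own tasks gate on the conditional ansible_distribution_major_version != '6', which the log reports as "Evaluated conditional (...): True" before anything runs. A rough, illustrative way to see how such a when: expression reduces to a boolean against the gathered facts is to template it with Jinja2 directly; Ansible's real Templar adds lazy variable resolution, unsafe-text handling and its own filter and test plugins on top of this.

import jinja2

def evaluate_when(expression, variables):
    # Render "{% if <expr> %}" against the facts and map the result to a Python bool.
    template = "{%% if %s %%}True{%% else %%}False{%% endif %%}" % expression
    return jinja2.Environment().from_string(template).render(**variables) == "True"

facts = {"ansible_distribution_major_version": "9"}   # assumed value, for illustration only
print(evaluate_when("ansible_distribution_major_version != '6'", facts))   # True
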
46400 1727204627.86234: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204627.86348: in run() - task 0affcd87-79f5-1303-fda8-0000000024fb 46400 1727204627.86360: variable 'ansible_search_path' from source: unknown 46400 1727204627.86364: variable 'ansible_search_path' from source: unknown 46400 1727204627.86399: calling self._execute() 46400 1727204627.86476: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.86480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.86488: variable 'omit' from source: magic vars 46400 1727204627.86775: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.86786: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.86939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204627.88584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204627.88628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204627.88659: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204627.88694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204627.88715: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204627.88779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204627.88802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204627.88820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204627.88848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204627.88859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204627.88903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204627.88919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204627.88936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204627.88961: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204627.88977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204627.89093: variable '__network_required_facts' from source: role '' defaults 46400 1727204627.89101: variable 'ansible_facts' from source: unknown 46400 1727204627.89690: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204627.89694: when evaluation is False, skipping this task 46400 1727204627.89697: _execute() done 46400 1727204627.89700: dumping result to json 46400 1727204627.89702: done dumping result, returning 46400 1727204627.89708: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-0000000024fb] 46400 1727204627.89713: sending task result for task 0affcd87-79f5-1303-fda8-0000000024fb 46400 1727204627.89808: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024fb 46400 1727204627.89811: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204627.89857: no more pending results, returning what we have 46400 1727204627.89861: results queue empty 46400 1727204627.89863: checking for any_errors_fatal 46400 1727204627.89867: done checking for any_errors_fatal 46400 1727204627.89867: checking for max_fail_percentage 46400 1727204627.89869: done checking for max_fail_percentage 46400 1727204627.89870: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.89871: done checking to see if all hosts have failed 46400 1727204627.89871: getting the remaining hosts for this loop 46400 1727204627.89873: done getting the remaining hosts for this loop 46400 1727204627.89877: getting the next task for host managed-node2 46400 1727204627.89888: done getting next task for host managed-node2 46400 1727204627.89893: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204627.89898: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.89935: getting variables 46400 1727204627.89937: in VariableManager get_vars() 46400 1727204627.89985: Calling all_inventory to load vars for managed-node2 46400 1727204627.89987: Calling groups_inventory to load vars for managed-node2 46400 1727204627.89990: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.90000: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.90002: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.90010: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.90973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.91912: done with get_vars() 46400 1727204627.91928: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.062) 0:01:58.204 ***** 46400 1727204627.92011: entering _queue_task() for managed-node2/stat 46400 1727204627.92260: worker is 1 (out of 1 available) 46400 1727204627.92277: exiting _queue_task() for managed-node2/stat 46400 1727204627.92291: done queuing things up, now waiting for results queue to drain 46400 1727204627.92292: waiting for pending results... 46400 1727204627.92492: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204627.92613: in run() - task 0affcd87-79f5-1303-fda8-0000000024fd 46400 1727204627.92628: variable 'ansible_search_path' from source: unknown 46400 1727204627.92631: variable 'ansible_search_path' from source: unknown 46400 1727204627.92663: calling self._execute() 46400 1727204627.92739: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.92750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.92754: variable 'omit' from source: magic vars 46400 1727204627.93085: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.93089: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.93180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204627.93474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204627.93477: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204627.93580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204627.93584: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204627.93600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204627.93619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204627.93637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204627.93656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204627.93728: variable '__network_is_ostree' from source: set_fact 46400 1727204627.93733: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204627.93736: when evaluation is False, skipping this task 46400 1727204627.93739: _execute() done 46400 1727204627.93741: dumping result to json 46400 1727204627.93744: done dumping result, returning 46400 1727204627.93751: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-0000000024fd] 46400 1727204627.93756: sending task result for task 0affcd87-79f5-1303-fda8-0000000024fd 46400 1727204627.93845: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024fd 46400 1727204627.93848: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204627.93902: no more pending results, returning what we have 46400 1727204627.93906: results queue empty 46400 1727204627.93907: checking for any_errors_fatal 46400 1727204627.93917: done checking for any_errors_fatal 46400 1727204627.93918: checking for max_fail_percentage 46400 1727204627.93920: done checking for max_fail_percentage 46400 1727204627.93921: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.93921: done checking to see if all hosts have failed 46400 1727204627.93922: getting the remaining hosts for this loop 46400 1727204627.93924: done getting the remaining hosts for this loop 46400 1727204627.93927: getting the next task for host managed-node2 46400 1727204627.93938: done getting next task for host managed-node2 46400 1727204627.93942: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204627.93948: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204627.93984: getting variables 46400 1727204627.93986: in VariableManager get_vars() 46400 1727204627.94027: Calling all_inventory to load vars for managed-node2 46400 1727204627.94029: Calling groups_inventory to load vars for managed-node2 46400 1727204627.94032: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.94042: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.94044: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.94047: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.95218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204627.96197: done with get_vars() 46400 1727204627.96214: done getting variables 46400 1727204627.96255: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.042) 0:01:58.247 ***** 46400 1727204627.96285: entering _queue_task() for managed-node2/set_fact 46400 1727204627.96513: worker is 1 (out of 1 available) 46400 1727204627.96526: exiting _queue_task() for managed-node2/set_fact 46400 1727204627.96538: done queuing things up, now waiting for results queue to drain 46400 1727204627.96539: waiting for pending results... 
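[Editorial note on the tasks traced above and below: the two ostree-related tasks in this stretch of the log ("Check if system is ostree" and "Set flag to indicate system is ostree", both from roles/network/tasks/set_facts.yml) are skipped because the fact __network_is_ostree was already set earlier in the run, so their shared conditional "not __network_is_ostree is defined" evaluates to False; the following "Check which services are running" task then gathers service facts over SSH. As a rough sketch of what that section of set_facts.yml likely contains, based only on the task names, modules (stat, set_fact, service_facts) and conditionals visible in this log; the stat path and the register name are assumptions, not confirmed by the log:

    # Sketch reconstructed from this log of roles/network/tasks/set_facts.yml
    # (around lines 12-21). /run/ostree-booted and __ostree_booted_stat are
    # assumed names used here only for illustration.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed marker file for ostree-based systems
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

    - name: Check which services are running
      service_facts:

The service_facts run produces the large JSON dump further down, which lands in ansible_facts.services keyed by unit name; the role can then inspect entries such as ansible_facts.services['NetworkManager.service'].state, though how it uses them is not shown in this excerpt. Each task is also gated by the ansible_distribution_major_version != '6' check that the log shows being evaluated first.]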
46400 1727204627.96732: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204627.96833: in run() - task 0affcd87-79f5-1303-fda8-0000000024fe 46400 1727204627.96846: variable 'ansible_search_path' from source: unknown 46400 1727204627.96853: variable 'ansible_search_path' from source: unknown 46400 1727204627.96887: calling self._execute() 46400 1727204627.96964: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204627.96971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204627.96983: variable 'omit' from source: magic vars 46400 1727204627.97333: variable 'ansible_distribution_major_version' from source: facts 46400 1727204627.97351: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204627.97537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204627.97826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204627.97886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204627.97923: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204627.97971: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204627.98069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204627.98101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204627.98133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204627.98177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204627.98279: variable '__network_is_ostree' from source: set_fact 46400 1727204627.98294: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204627.98303: when evaluation is False, skipping this task 46400 1727204627.98310: _execute() done 46400 1727204627.98317: dumping result to json 46400 1727204627.98325: done dumping result, returning 46400 1727204627.98336: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-0000000024fe] 46400 1727204627.98346: sending task result for task 0affcd87-79f5-1303-fda8-0000000024fe 46400 1727204627.98476: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024fe 46400 1727204627.98479: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204627.98526: no more pending results, returning what we have 46400 1727204627.98531: results queue empty 46400 1727204627.98532: checking for any_errors_fatal 46400 1727204627.98541: done checking for any_errors_fatal 46400 
1727204627.98542: checking for max_fail_percentage 46400 1727204627.98544: done checking for max_fail_percentage 46400 1727204627.98545: checking to see if all hosts have failed and the running result is not ok 46400 1727204627.98545: done checking to see if all hosts have failed 46400 1727204627.98546: getting the remaining hosts for this loop 46400 1727204627.98548: done getting the remaining hosts for this loop 46400 1727204627.98551: getting the next task for host managed-node2 46400 1727204627.98565: done getting next task for host managed-node2 46400 1727204627.98570: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204627.98576: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204627.98609: getting variables 46400 1727204627.98610: in VariableManager get_vars() 46400 1727204627.98656: Calling all_inventory to load vars for managed-node2 46400 1727204627.98659: Calling groups_inventory to load vars for managed-node2 46400 1727204627.98662: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204627.98673: Calling all_plugins_play to load vars for managed-node2 46400 1727204627.98675: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204627.98678: Calling groups_plugins_play to load vars for managed-node2 46400 1727204627.99693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204628.00719: done with get_vars() 46400 1727204628.00742: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.045) 0:01:58.293 ***** 46400 1727204628.00854: entering _queue_task() for managed-node2/service_facts 46400 1727204628.01187: worker is 1 (out of 1 available) 46400 1727204628.01200: exiting _queue_task() for managed-node2/service_facts 46400 1727204628.02582: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204628.02588: in run() - task 0affcd87-79f5-1303-fda8-000000002500 46400 1727204628.02592: variable 'ansible_search_path' from source: unknown 46400 1727204628.02595: variable 'ansible_search_path' from source: unknown 46400 1727204628.02598: calling self._execute() 46400 1727204628.02601: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204628.02603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204628.02605: variable 'omit' from source: magic vars 46400 1727204628.02608: variable 'ansible_distribution_major_version' from source: facts 46400 1727204628.02610: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204628.02613: variable 'omit' from source: magic vars 46400 1727204628.02615: variable 'omit' from source: magic vars 46400 1727204628.02617: variable 'omit' from source: magic vars 46400 1727204628.02620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204628.02622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204628.02625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204628.02627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204628.02629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204628.02632: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204628.02634: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204628.02638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204628.02641: Set connection var ansible_shell_type to sh 46400 1727204628.02643: Set connection var ansible_shell_executable to /bin/sh 46400 1727204628.02645: Set connection var ansible_module_compression to 
ZIP_DEFLATED 46400 1727204628.02647: Set connection var ansible_connection to ssh 46400 1727204628.02649: Set connection var ansible_pipelining to False 46400 1727204628.02651: Set connection var ansible_timeout to 10 46400 1727204628.02653: variable 'ansible_shell_executable' from source: unknown 46400 1727204628.02655: variable 'ansible_connection' from source: unknown 46400 1727204628.02657: variable 'ansible_module_compression' from source: unknown 46400 1727204628.02659: variable 'ansible_shell_type' from source: unknown 46400 1727204628.02661: variable 'ansible_shell_executable' from source: unknown 46400 1727204628.02662: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204628.02666: variable 'ansible_pipelining' from source: unknown 46400 1727204628.02668: variable 'ansible_timeout' from source: unknown 46400 1727204628.02670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204628.02674: done queuing things up, now waiting for results queue to drain 46400 1727204628.02676: waiting for pending results... 46400 1727204628.02828: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204628.02839: variable 'omit' from source: magic vars 46400 1727204628.02844: starting attempt loop 46400 1727204628.02847: running the handler 46400 1727204628.02859: _low_level_execute_command(): starting 46400 1727204628.02871: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204628.03578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204628.03590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.03601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.03615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.03653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.03659: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204628.03675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.03688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204628.03697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204628.03704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204628.03711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.03721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.03732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.03740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.03747: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204628.03756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.03832: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 46400 1727204628.03848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204628.03851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204628.03936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204628.05578: stdout chunk (state=3): >>>/root <<< 46400 1727204628.05742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204628.05746: stdout chunk (state=3): >>><<< 46400 1727204628.05756: stderr chunk (state=3): >>><<< 46400 1727204628.05781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204628.05793: _low_level_execute_command(): starting 46400 1727204628.05799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120 `" && echo ansible-tmp-1727204628.057798-54582-253085867982120="` echo /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120 `" ) && sleep 0' 46400 1727204628.06425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204628.06433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.06444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.06458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.06501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.06508: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204628.06517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.06530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204628.06537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204628.06544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204628.06551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.06560: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 46400 1727204628.06576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.06583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.06590: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204628.06598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.06674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204628.06687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204628.06698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204628.06771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204628.08605: stdout chunk (state=3): >>>ansible-tmp-1727204628.057798-54582-253085867982120=/root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120 <<< 46400 1727204628.08724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204628.08801: stderr chunk (state=3): >>><<< 46400 1727204628.08804: stdout chunk (state=3): >>><<< 46400 1727204628.08823: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204628.057798-54582-253085867982120=/root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204628.08874: variable 'ansible_module_compression' from source: unknown 46400 1727204628.08918: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204628.08956: variable 'ansible_facts' from source: unknown 46400 1727204628.09043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/AnsiballZ_service_facts.py 46400 1727204628.09192: Sending initial data 46400 1727204628.09196: Sent initial data (161 bytes) 46400 1727204628.10154: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204628.10169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.10182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.10192: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.10230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.10237: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204628.10246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.10259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204628.10277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204628.10286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204628.10293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.10301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.10314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.10323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.10330: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204628.10339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.10415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204628.10429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204628.10439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204628.10505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204628.12212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204628.12246: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204628.12287: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpngzy1j91 /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/AnsiballZ_service_facts.py <<< 46400 1727204628.12322: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204628.13392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204628.13573: stderr chunk (state=3): >>><<< 46400 1727204628.13576: stdout chunk (state=3): >>><<< 46400 1727204628.13597: done transferring module to remote 46400 1727204628.13609: _low_level_execute_command(): starting 46400 1727204628.13612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/ 
/root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/AnsiballZ_service_facts.py && sleep 0' 46400 1727204628.14224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204628.14233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.14243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.14256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.14299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.14305: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204628.14315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.14328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204628.14335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204628.14343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204628.14350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.14359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.14380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.14387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.14394: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204628.14404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.14478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204628.14496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204628.14509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204628.14582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204628.16454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204628.16459: stdout chunk (state=3): >>><<< 46400 1727204628.16471: stderr chunk (state=3): >>><<< 46400 1727204628.16489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204628.16494: _low_level_execute_command(): starting 46400 1727204628.16496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/AnsiballZ_service_facts.py && sleep 0' 46400 1727204628.17154: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204628.17168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.17179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.17193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.17233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.17239: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204628.17250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.17266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204628.17277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204628.17284: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204628.17291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204628.17300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204628.17312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204628.17319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204628.17326: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204628.17335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204628.17404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204628.17419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204628.17429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204628.17508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.47596: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 46400 1727204629.47634: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": 
"sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": 
{"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204629.48958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204629.48962: stdout chunk (state=3): >>><<< 46400 1727204629.48966: stderr chunk (state=3): >>><<< 46400 1727204629.49080: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204629.50027: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204629.50095: _low_level_execute_command(): starting 46400 1727204629.50105: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204628.057798-54582-253085867982120/ > /dev/null 2>&1 && sleep 0' 46400 1727204629.50818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204629.50841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.50861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.50883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.50926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.50939: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204629.50966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.50986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204629.50999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204629.51011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204629.51024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.51038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.51061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.51080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.51093: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204629.51107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.51192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.51216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.51233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204629.51311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.53116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204629.53218: stderr chunk (state=3): >>><<< 46400 1727204629.53231: stdout chunk (state=3): >>><<< 46400 1727204629.53376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204629.53380: handler run complete 46400 1727204629.53496: variable 'ansible_facts' from source: unknown 46400 1727204629.53673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204629.54486: variable 'ansible_facts' from source: unknown 46400 1727204629.54741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204629.55172: attempt loop complete, returning result 46400 1727204629.55185: _execute() done 46400 1727204629.55193: dumping result to json 46400 1727204629.55374: done dumping result, returning 46400 1727204629.55389: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-000000002500] 46400 1727204629.55406: sending task result for task 0affcd87-79f5-1303-fda8-000000002500 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204629.56848: no more pending results, returning what we have 46400 1727204629.56853: results queue empty 46400 1727204629.56854: checking for any_errors_fatal 46400 1727204629.56859: done checking for any_errors_fatal 46400 1727204629.56860: checking for max_fail_percentage 46400 1727204629.56862: done checking for max_fail_percentage 46400 1727204629.56865: checking to see if all hosts have failed and the running result is not ok 46400 1727204629.56866: done checking to see if all hosts have failed 46400 1727204629.56867: getting the remaining hosts for this loop 46400 1727204629.56869: done getting the remaining hosts for this loop 46400 1727204629.56874: getting the next task for host managed-node2 46400 1727204629.56884: done getting next task for host managed-node2 46400 1727204629.56888: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204629.56895: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204629.56910: getting variables 46400 1727204629.56911: in VariableManager get_vars() 46400 1727204629.56959: Calling all_inventory to load vars for managed-node2 46400 1727204629.56962: Calling groups_inventory to load vars for managed-node2 46400 1727204629.56972: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204629.56983: Calling all_plugins_play to load vars for managed-node2 46400 1727204629.56986: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204629.56988: Calling groups_plugins_play to load vars for managed-node2 46400 1727204629.58172: done sending task result for task 0affcd87-79f5-1303-fda8-000000002500 46400 1727204629.58176: WORKER PROCESS EXITING 46400 1727204629.60254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204629.64169: done with get_vars() 46400 1727204629.64206: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:49 -0400 (0:00:01.635) 0:01:59.928 ***** 46400 1727204629.64437: entering _queue_task() for managed-node2/package_facts 46400 1727204629.65275: worker is 1 (out of 1 available) 46400 1727204629.65290: exiting _queue_task() for managed-node2/package_facts 46400 1727204629.65303: done queuing things up, now waiting for results queue to drain 46400 1727204629.65305: waiting for pending results... 
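The task queued next runs the package_facts module (task path roles/network/tasks/set_facts.yml:26 in the collection checkout). A minimal sketch of such a task plus one way to consume ansible_facts.packages follows; the debug consumer and the 'NetworkManager' key are illustrative assumptions, not the contents of set_facts.yml:

    # illustrative sketch only -- the real task lives at
    # roles/network/tasks/set_facts.yml:26 and may differ
    - name: Check which packages are installed
      package_facts:

    - name: Show the installed NetworkManager version, if any
      debug:
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }}"
      when: "'NetworkManager' in ansible_facts.packages"

Each key in ansible_facts.packages maps to a list of dicts with name, version, release, epoch, arch, and source, as the module output later in this log shows.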
46400 1727204629.66069: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204629.66445: in run() - task 0affcd87-79f5-1303-fda8-000000002501 46400 1727204629.66459: variable 'ansible_search_path' from source: unknown 46400 1727204629.66463: variable 'ansible_search_path' from source: unknown 46400 1727204629.66509: calling self._execute() 46400 1727204629.66813: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204629.66819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204629.66830: variable 'omit' from source: magic vars 46400 1727204629.67537: variable 'ansible_distribution_major_version' from source: facts 46400 1727204629.67550: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204629.67558: variable 'omit' from source: magic vars 46400 1727204629.67754: variable 'omit' from source: magic vars 46400 1727204629.67797: variable 'omit' from source: magic vars 46400 1727204629.67841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204629.67880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204629.67902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204629.67919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204629.67929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204629.67959: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204629.67965: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204629.67969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204629.68058: Set connection var ansible_shell_type to sh 46400 1727204629.68074: Set connection var ansible_shell_executable to /bin/sh 46400 1727204629.68077: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204629.68083: Set connection var ansible_connection to ssh 46400 1727204629.68088: Set connection var ansible_pipelining to False 46400 1727204629.68094: Set connection var ansible_timeout to 10 46400 1727204629.68119: variable 'ansible_shell_executable' from source: unknown 46400 1727204629.68122: variable 'ansible_connection' from source: unknown 46400 1727204629.68126: variable 'ansible_module_compression' from source: unknown 46400 1727204629.68128: variable 'ansible_shell_type' from source: unknown 46400 1727204629.68131: variable 'ansible_shell_executable' from source: unknown 46400 1727204629.68133: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204629.68136: variable 'ansible_pipelining' from source: unknown 46400 1727204629.68138: variable 'ansible_timeout' from source: unknown 46400 1727204629.68140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204629.68348: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204629.68360: variable 'omit' from source: magic vars 46400 
1727204629.68369: starting attempt loop 46400 1727204629.68372: running the handler 46400 1727204629.68390: _low_level_execute_command(): starting 46400 1727204629.68399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204629.70895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.70905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.71054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.71059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.71079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204629.71085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.71277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.71293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.71298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204629.71380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.73053: stdout chunk (state=3): >>>/root <<< 46400 1727204629.73227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204629.73231: stderr chunk (state=3): >>><<< 46400 1727204629.73234: stdout chunk (state=3): >>><<< 46400 1727204629.73267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204629.73283: _low_level_execute_command(): starting 46400 1727204629.73290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442 `" && echo ansible-tmp-1727204629.7326746-54926-190105689570442="` echo /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442 `" ) && sleep 0' 46400 1727204629.74995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.74999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.75114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.75154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204629.75167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 46400 1727204629.75219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.75226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204629.75244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.75388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.75437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.75440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204629.75515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.77396: stdout chunk (state=3): >>>ansible-tmp-1727204629.7326746-54926-190105689570442=/root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442 <<< 46400 1727204629.77579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204629.77582: stderr chunk (state=3): >>><<< 46400 1727204629.77585: stdout chunk (state=3): >>><<< 46400 1727204629.77604: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204629.7326746-54926-190105689570442=/root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204629.77657: variable 'ansible_module_compression' from source: unknown 46400 1727204629.77712: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204629.77777: variable 'ansible_facts' from source: unknown 46400 1727204629.77980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/AnsiballZ_package_facts.py 46400 1727204629.78611: Sending initial data 46400 1727204629.78614: Sent initial data (162 bytes) 46400 1727204629.81043: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204629.81187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.81197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.81212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.81255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.81267: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204629.81282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.81295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204629.81304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204629.81312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204629.81320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.81330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.81343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.81352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.81358: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204629.81375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.81446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.81591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.81606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204629.81831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.83560: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 46400 1727204629.83596: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204629.83631: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp_mh7d7q5 /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/AnsiballZ_package_facts.py <<< 46400 1727204629.83673: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204629.86689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204629.86859: stderr chunk (state=3): >>><<< 46400 1727204629.86863: stdout chunk (state=3): >>><<< 46400 1727204629.86890: done transferring module to remote 46400 1727204629.86901: _low_level_execute_command(): starting 46400 1727204629.86905: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/ /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/AnsiballZ_package_facts.py && sleep 0' 46400 1727204629.88637: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204629.88784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.88797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.88810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.88849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.88856: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204629.88871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.88885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204629.88892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204629.88901: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204629.88907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.88916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.88927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.88934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.88941: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204629.88950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.89018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.89086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.89092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204629.89338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204629.91086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204629.91154: stderr chunk (state=3): >>><<< 46400 1727204629.91158: stdout chunk (state=3): 
>>><<< 46400 1727204629.91184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204629.91187: _low_level_execute_command(): starting 46400 1727204629.91190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/AnsiballZ_package_facts.py && sleep 0' 46400 1727204629.92847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204629.92985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.92995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.93010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.93052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.93059: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204629.93074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.93089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204629.93096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204629.93103: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204629.93111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204629.93120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204629.93132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204629.93140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204629.93148: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204629.93157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204629.93441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204629.93457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204629.93460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 46400 1727204629.93643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204630.40117: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 46400 1727204630.40132: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204630.40136: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 46400 1727204630.40161: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204630.40220: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204630.40232: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", 
"release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204630.40235: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": 
"0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 46400 1727204630.40241: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": 
"1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 46400 1727204630.40284: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", 
"version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204630.40293: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204630.40298: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": 
"kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204630.40329: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204630.40333: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204630.40362: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204630.41910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204630.42009: stderr chunk (state=3): >>><<< 46400 1727204630.42014: stdout chunk (state=3): >>><<< 46400 1727204630.42076: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204630.44228: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204630.44245: _low_level_execute_command(): starting 46400 1727204630.44250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204629.7326746-54926-190105689570442/ > /dev/null 2>&1 && sleep 0' 46400 1727204630.44790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204630.44796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204630.44832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204630.44857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204630.44886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204630.44955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204630.44961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204630.44965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204630.45008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204630.46832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204630.46906: stderr chunk (state=3): >>><<< 46400 1727204630.46909: stdout chunk (state=3): >>><<< 46400 1727204630.46940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204630.46943: handler run complete 46400 1727204630.47976: variable 'ansible_facts' from source: unknown 46400 1727204630.48514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.50723: variable 'ansible_facts' from source: unknown 46400 1727204630.51212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.52010: attempt loop complete, returning result 46400 1727204630.52023: _execute() done 46400 1727204630.52026: dumping result to json 46400 1727204630.52271: done dumping result, returning 46400 1727204630.52291: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-000000002501] 46400 1727204630.52296: sending task result for task 0affcd87-79f5-1303-fda8-000000002501 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204630.54823: no more pending results, returning what we have 46400 1727204630.54827: results queue empty 46400 1727204630.54828: checking for any_errors_fatal 46400 1727204630.54835: done checking for any_errors_fatal 46400 1727204630.54836: checking for max_fail_percentage 46400 1727204630.54839: done checking for max_fail_percentage 46400 1727204630.54840: checking to see if all hosts have failed and the running result is not ok 46400 1727204630.54841: done checking to see if all hosts have failed 46400 1727204630.54842: getting the remaining hosts for this loop 46400 1727204630.54844: done getting the remaining hosts for this loop 46400 1727204630.54848: getting the next task for host managed-node2 46400 1727204630.54858: done getting next task for host managed-node2 46400 1727204630.54863: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204630.54870: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 46400 1727204630.54886: getting variables 46400 1727204630.54887: in VariableManager get_vars() 46400 1727204630.54928: Calling all_inventory to load vars for managed-node2 46400 1727204630.54931: Calling groups_inventory to load vars for managed-node2 46400 1727204630.54933: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204630.54944: Calling all_plugins_play to load vars for managed-node2 46400 1727204630.54946: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204630.54949: Calling groups_plugins_play to load vars for managed-node2 46400 1727204630.55812: done sending task result for task 0affcd87-79f5-1303-fda8-000000002501 46400 1727204630.55816: WORKER PROCESS EXITING 46400 1727204630.56660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.59820: done with get_vars() 46400 1727204630.59862: done getting variables 46400 1727204630.59944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.955) 0:02:00.884 ***** 46400 1727204630.59993: entering _queue_task() for managed-node2/debug 46400 1727204630.60398: worker is 1 (out of 1 available) 46400 1727204630.60413: exiting _queue_task() for managed-node2/debug 46400 1727204630.60436: done queuing things up, now waiting for results queue to drain 46400 1727204630.60439: waiting for pending results... 
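The task queued here, "Print network provider" (tasks/main.yml:7), is a debug action whose result a little further down reads "Using network provider: nm", with network_provider coming from an earlier set_fact. A hedged sketch of such a task, assuming the msg wording from that output rather than quoting the role source:

    # Sketch only; the msg format is assumed from the logged output
    # "Using network provider: nm".
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"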
46400 1727204630.60831: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204630.61018: in run() - task 0affcd87-79f5-1303-fda8-0000000024a5 46400 1727204630.61039: variable 'ansible_search_path' from source: unknown 46400 1727204630.61046: variable 'ansible_search_path' from source: unknown 46400 1727204630.61101: calling self._execute() 46400 1727204630.61217: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.61230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.61247: variable 'omit' from source: magic vars 46400 1727204630.61697: variable 'ansible_distribution_major_version' from source: facts 46400 1727204630.61714: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204630.61726: variable 'omit' from source: magic vars 46400 1727204630.61833: variable 'omit' from source: magic vars 46400 1727204630.62104: variable 'network_provider' from source: set_fact 46400 1727204630.62130: variable 'omit' from source: magic vars 46400 1727204630.62195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204630.62237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204630.62266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204630.62305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204630.62322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204630.62356: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204630.62367: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.62377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.62496: Set connection var ansible_shell_type to sh 46400 1727204630.62526: Set connection var ansible_shell_executable to /bin/sh 46400 1727204630.62538: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204630.62549: Set connection var ansible_connection to ssh 46400 1727204630.62559: Set connection var ansible_pipelining to False 46400 1727204630.62573: Set connection var ansible_timeout to 10 46400 1727204630.62680: variable 'ansible_shell_executable' from source: unknown 46400 1727204630.62690: variable 'ansible_connection' from source: unknown 46400 1727204630.62697: variable 'ansible_module_compression' from source: unknown 46400 1727204630.62703: variable 'ansible_shell_type' from source: unknown 46400 1727204630.62709: variable 'ansible_shell_executable' from source: unknown 46400 1727204630.62715: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.62721: variable 'ansible_pipelining' from source: unknown 46400 1727204630.62739: variable 'ansible_timeout' from source: unknown 46400 1727204630.62748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.62903: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204630.62921: variable 'omit' from source: magic vars 46400 1727204630.62930: starting attempt loop 46400 1727204630.62936: running the handler 46400 1727204630.63002: handler run complete 46400 1727204630.63022: attempt loop complete, returning result 46400 1727204630.63030: _execute() done 46400 1727204630.63037: dumping result to json 46400 1727204630.63043: done dumping result, returning 46400 1727204630.63055: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-0000000024a5] 46400 1727204630.63082: sending task result for task 0affcd87-79f5-1303-fda8-0000000024a5 ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204630.63266: no more pending results, returning what we have 46400 1727204630.63271: results queue empty 46400 1727204630.63272: checking for any_errors_fatal 46400 1727204630.63283: done checking for any_errors_fatal 46400 1727204630.63284: checking for max_fail_percentage 46400 1727204630.63286: done checking for max_fail_percentage 46400 1727204630.63287: checking to see if all hosts have failed and the running result is not ok 46400 1727204630.63287: done checking to see if all hosts have failed 46400 1727204630.63288: getting the remaining hosts for this loop 46400 1727204630.63290: done getting the remaining hosts for this loop 46400 1727204630.63295: getting the next task for host managed-node2 46400 1727204630.63305: done getting next task for host managed-node2 46400 1727204630.63310: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204630.63315: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204630.63333: getting variables 46400 1727204630.63334: in VariableManager get_vars() 46400 1727204630.63391: Calling all_inventory to load vars for managed-node2 46400 1727204630.63394: Calling groups_inventory to load vars for managed-node2 46400 1727204630.63397: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204630.63409: Calling all_plugins_play to load vars for managed-node2 46400 1727204630.63411: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204630.63414: Calling groups_plugins_play to load vars for managed-node2 46400 1727204630.64456: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a5 46400 1727204630.64459: WORKER PROCESS EXITING 46400 1727204630.65365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.66845: done with get_vars() 46400 1727204630.66873: done getting variables 46400 1727204630.66920: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.069) 0:02:00.954 ***** 46400 1727204630.66954: entering _queue_task() for managed-node2/fail 46400 1727204630.67208: worker is 1 (out of 1 available) 46400 1727204630.67223: exiting _queue_task() for managed-node2/fail 46400 1727204630.67237: done queuing things up, now waiting for results queue to drain 46400 1727204630.67239: waiting for pending results... 
46400 1727204630.67445: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204630.67563: in run() - task 0affcd87-79f5-1303-fda8-0000000024a6 46400 1727204630.67579: variable 'ansible_search_path' from source: unknown 46400 1727204630.67583: variable 'ansible_search_path' from source: unknown 46400 1727204630.67614: calling self._execute() 46400 1727204630.67695: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.67698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.67708: variable 'omit' from source: magic vars 46400 1727204630.68004: variable 'ansible_distribution_major_version' from source: facts 46400 1727204630.68013: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204630.68120: variable 'network_state' from source: role '' defaults 46400 1727204630.68151: Evaluated conditional (network_state != {}): False 46400 1727204630.68185: when evaluation is False, skipping this task 46400 1727204630.68193: _execute() done 46400 1727204630.68197: dumping result to json 46400 1727204630.68199: done dumping result, returning 46400 1727204630.68206: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-0000000024a6] 46400 1727204630.68212: sending task result for task 0affcd87-79f5-1303-fda8-0000000024a6 46400 1727204630.68420: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a6 46400 1727204630.68424: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204630.68479: no more pending results, returning what we have 46400 1727204630.68484: results queue empty 46400 1727204630.68485: checking for any_errors_fatal 46400 1727204630.68496: done checking for any_errors_fatal 46400 1727204630.68497: checking for max_fail_percentage 46400 1727204630.68499: done checking for max_fail_percentage 46400 1727204630.68500: checking to see if all hosts have failed and the running result is not ok 46400 1727204630.68501: done checking to see if all hosts have failed 46400 1727204630.68501: getting the remaining hosts for this loop 46400 1727204630.68503: done getting the remaining hosts for this loop 46400 1727204630.68508: getting the next task for host managed-node2 46400 1727204630.68517: done getting next task for host managed-node2 46400 1727204630.68521: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204630.68525: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204630.68559: getting variables 46400 1727204630.68562: in VariableManager get_vars() 46400 1727204630.68614: Calling all_inventory to load vars for managed-node2 46400 1727204630.68617: Calling groups_inventory to load vars for managed-node2 46400 1727204630.68619: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204630.68631: Calling all_plugins_play to load vars for managed-node2 46400 1727204630.68633: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204630.68636: Calling groups_plugins_play to load vars for managed-node2 46400 1727204630.69984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.71046: done with get_vars() 46400 1727204630.71067: done getting variables 46400 1727204630.71114: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.041) 0:02:00.996 ***** 46400 1727204630.71144: entering _queue_task() for managed-node2/fail 46400 1727204630.71398: worker is 1 (out of 1 available) 46400 1727204630.71413: exiting _queue_task() for managed-node2/fail 46400 1727204630.71427: done queuing things up, now waiting for results queue to drain 46400 1727204630.71429: waiting for pending results... 
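The abort task above was skipped because its condition, network_state != {}, evaluated to False (network_state comes from the role defaults and is empty on this run), and the version check task queued next is gated the same way, as its skip result below shows. A rough sketch of such a when-gated fail task; the failure message is assumed, since a skipped task never prints one.

    # Illustrative only; the when clause matches the logged false_condition,
    # the msg text is an assumption.
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider
      when: network_state != {}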
46400 1727204630.71635: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204630.71744: in run() - task 0affcd87-79f5-1303-fda8-0000000024a7 46400 1727204630.71759: variable 'ansible_search_path' from source: unknown 46400 1727204630.71762: variable 'ansible_search_path' from source: unknown 46400 1727204630.71800: calling self._execute() 46400 1727204630.71880: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.71883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.71892: variable 'omit' from source: magic vars 46400 1727204630.72185: variable 'ansible_distribution_major_version' from source: facts 46400 1727204630.72196: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204630.72333: variable 'network_state' from source: role '' defaults 46400 1727204630.72350: Evaluated conditional (network_state != {}): False 46400 1727204630.72360: when evaluation is False, skipping this task 46400 1727204630.72374: _execute() done 46400 1727204630.72382: dumping result to json 46400 1727204630.72390: done dumping result, returning 46400 1727204630.72402: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-0000000024a7] 46400 1727204630.72418: sending task result for task 0affcd87-79f5-1303-fda8-0000000024a7 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204630.72596: no more pending results, returning what we have 46400 1727204630.72601: results queue empty 46400 1727204630.72603: checking for any_errors_fatal 46400 1727204630.72615: done checking for any_errors_fatal 46400 1727204630.72616: checking for max_fail_percentage 46400 1727204630.72618: done checking for max_fail_percentage 46400 1727204630.72619: checking to see if all hosts have failed and the running result is not ok 46400 1727204630.72620: done checking to see if all hosts have failed 46400 1727204630.72620: getting the remaining hosts for this loop 46400 1727204630.72623: done getting the remaining hosts for this loop 46400 1727204630.72627: getting the next task for host managed-node2 46400 1727204630.72638: done getting next task for host managed-node2 46400 1727204630.72643: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204630.72655: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204630.72696: getting variables 46400 1727204630.72699: in VariableManager get_vars() 46400 1727204630.72747: Calling all_inventory to load vars for managed-node2 46400 1727204630.72750: Calling groups_inventory to load vars for managed-node2 46400 1727204630.72752: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204630.72874: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a7 46400 1727204630.72878: WORKER PROCESS EXITING 46400 1727204630.72891: Calling all_plugins_play to load vars for managed-node2 46400 1727204630.72895: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204630.72898: Calling groups_plugins_play to load vars for managed-node2 46400 1727204630.74656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.87097: done with get_vars() 46400 1727204630.87130: done getting variables 46400 1727204630.87191: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.160) 0:02:01.156 ***** 46400 1727204630.87226: entering _queue_task() for managed-node2/fail 46400 1727204630.87679: worker is 1 (out of 1 available) 46400 1727204630.87693: exiting _queue_task() for managed-node2/fail 46400 1727204630.87707: done queuing things up, now waiting for results queue to drain 46400 1727204630.87713: waiting for pending results... 
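The teaming abort task queued here (tasks/main.yml:25) is another fail action, this time gated on the distribution major version; the evaluation below shows ansible_distribution_major_version | int > 9 resolving to False on this EL9 managed node. A sketch under that assumption, with the failure message again assumed:

    # Sketch; the when clause matches the condition evaluated in the log,
    # the msg text is an assumption.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later
      when: ansible_distribution_major_version | int > 9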
46400 1727204630.88020: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204630.88221: in run() - task 0affcd87-79f5-1303-fda8-0000000024a8 46400 1727204630.88245: variable 'ansible_search_path' from source: unknown 46400 1727204630.88259: variable 'ansible_search_path' from source: unknown 46400 1727204630.88307: calling self._execute() 46400 1727204630.88419: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204630.88431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204630.88446: variable 'omit' from source: magic vars 46400 1727204630.88890: variable 'ansible_distribution_major_version' from source: facts 46400 1727204630.88913: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204630.89104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204630.92915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204630.93142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204630.93212: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204630.93315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204630.93415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204630.93522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204630.93561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204630.93598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204630.93653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204630.93679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204630.93810: variable 'ansible_distribution_major_version' from source: facts 46400 1727204630.93862: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204630.93874: when evaluation is False, skipping this task 46400 1727204630.93886: _execute() done 46400 1727204630.93893: dumping result to json 46400 1727204630.93900: done dumping result, returning 46400 1727204630.93911: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-0000000024a8] 46400 1727204630.93937: sending task result for task 
0affcd87-79f5-1303-fda8-0000000024a8 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204630.94141: no more pending results, returning what we have 46400 1727204630.94146: results queue empty 46400 1727204630.94147: checking for any_errors_fatal 46400 1727204630.94158: done checking for any_errors_fatal 46400 1727204630.94159: checking for max_fail_percentage 46400 1727204630.94161: done checking for max_fail_percentage 46400 1727204630.94162: checking to see if all hosts have failed and the running result is not ok 46400 1727204630.94163: done checking to see if all hosts have failed 46400 1727204630.94165: getting the remaining hosts for this loop 46400 1727204630.94167: done getting the remaining hosts for this loop 46400 1727204630.94173: getting the next task for host managed-node2 46400 1727204630.94184: done getting next task for host managed-node2 46400 1727204630.94190: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204630.94197: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204630.94231: getting variables 46400 1727204630.94234: in VariableManager get_vars() 46400 1727204630.94284: Calling all_inventory to load vars for managed-node2 46400 1727204630.94286: Calling groups_inventory to load vars for managed-node2 46400 1727204630.94288: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204630.94300: Calling all_plugins_play to load vars for managed-node2 46400 1727204630.94302: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204630.94305: Calling groups_plugins_play to load vars for managed-node2 46400 1727204630.95871: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a8 46400 1727204630.95876: WORKER PROCESS EXITING 46400 1727204630.96759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204630.98990: done with get_vars() 46400 1727204630.99028: done getting variables 46400 1727204630.99107: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.119) 0:02:01.276 ***** 46400 1727204630.99229: entering _queue_task() for managed-node2/dnf 46400 1727204630.99698: worker is 1 (out of 1 available) 46400 1727204630.99717: exiting _queue_task() for managed-node2/dnf 46400 1727204630.99731: done queuing things up, now waiting for results queue to drain 46400 1727204630.99732: waiting for pending results... 
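The task queued here (tasks/main.yml:36) uses the dnf action plugin and, per the conditional evaluated just below, only proceeds when ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7. A loose sketch under those assumptions; the package list variable and the check_mode setting are hypothetical placeholders, not the role's actual values.

    # Loose sketch; only the task name, the dnf action and the when clause are
    # taken from the log. The package variable and check_mode are assumed.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ __network_packages }}"   # hypothetical placeholder variable
        state: latest
      check_mode: true                      # assumption: query only, no install
      when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7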
46400 1727204631.00061: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204631.00224: in run() - task 0affcd87-79f5-1303-fda8-0000000024a9 46400 1727204631.00239: variable 'ansible_search_path' from source: unknown 46400 1727204631.00243: variable 'ansible_search_path' from source: unknown 46400 1727204631.00293: calling self._execute() 46400 1727204631.00405: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.00414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.00420: variable 'omit' from source: magic vars 46400 1727204631.00843: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.00853: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.01062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.04920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.05135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.05278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.05347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.05392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.05467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.05487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.05523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.05550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.05568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.05662: variable 'ansible_distribution' from source: facts 46400 1727204631.05671: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.05685: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204631.05801: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.05890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.05907: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.05926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.05955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.05971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.06000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.06016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.06034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.06068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.06080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.06108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.06124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.06142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.06174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.06185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.06295: variable 'network_connections' from source: include params 46400 1727204631.06305: variable 'interface' from source: play vars 46400 1727204631.06351: variable 'interface' from source: play vars 46400 1727204631.06408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204631.06876: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204631.06907: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204631.06931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204631.06953: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204631.06991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204631.07007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204631.07032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.07051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204631.07118: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204631.08181: variable 'network_connections' from source: include params 46400 1727204631.08184: variable 'interface' from source: play vars 46400 1727204631.08186: variable 'interface' from source: play vars 46400 1727204631.08189: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204631.08190: when evaluation is False, skipping this task 46400 1727204631.08192: _execute() done 46400 1727204631.08194: dumping result to json 46400 1727204631.08196: done dumping result, returning 46400 1727204631.08198: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000024a9] 46400 1727204631.08200: sending task result for task 0affcd87-79f5-1303-fda8-0000000024a9 46400 1727204631.08281: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024a9 46400 1727204631.08285: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204631.08343: no more pending results, returning what we have 46400 1727204631.08349: results queue empty 46400 1727204631.08351: checking for any_errors_fatal 46400 1727204631.08356: done checking for any_errors_fatal 46400 1727204631.08357: checking for max_fail_percentage 46400 1727204631.08359: done checking for max_fail_percentage 46400 1727204631.08360: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.08361: done checking to see if all hosts have failed 46400 1727204631.08362: getting the remaining hosts for this loop 46400 1727204631.08365: done getting the remaining hosts for this loop 46400 1727204631.08370: getting the next task for host managed-node2 46400 1727204631.08379: done getting next task for host managed-node2 46400 1727204631.08384: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204631.08389: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204631.08421: getting variables 46400 1727204631.08424: in VariableManager get_vars() 46400 1727204631.08479: Calling all_inventory to load vars for managed-node2 46400 1727204631.08482: Calling groups_inventory to load vars for managed-node2 46400 1727204631.08485: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.08496: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.08499: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.08501: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.11472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.12859: done with get_vars() 46400 1727204631.12884: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204631.12943: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.137) 0:02:01.414 ***** 46400 1727204631.12972: entering _queue_task() for managed-node2/yum 46400 1727204631.13269: worker is 1 (out of 1 available) 46400 1727204631.13283: exiting _queue_task() for managed-node2/yum 46400 1727204631.13294: done queuing things up, now waiting for results queue to drain 46400 1727204631.13295: waiting for pending results... 
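The skip reported below gives false_condition "ansible_distribution_major_version | int < 8", and on this Ansible version the yum action is redirected to dnf, so the task at roles/network/tasks/main.yml:48 presumably mirrors the DNF check above for older EL hosts. A rough sketch; only the condition that actually appears in the log is confirmed, the rest is assumed to match the DNF variant.

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:
      name: "{{ network_packages }}"  # assumed, as in the DNF sketch
      state: latest                   # assumed
    check_mode: true                  # assumed
    when:
      - ansible_distribution_major_version | int < 8
      - __network_wireless_connections_defined or __network_team_connections_defined  # assumed; not evaluated here because the version check already failed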
46400 1727204631.13507: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204631.13681: in run() - task 0affcd87-79f5-1303-fda8-0000000024aa 46400 1727204631.13707: variable 'ansible_search_path' from source: unknown 46400 1727204631.13720: variable 'ansible_search_path' from source: unknown 46400 1727204631.13777: calling self._execute() 46400 1727204631.13931: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.13949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.13982: variable 'omit' from source: magic vars 46400 1727204631.14511: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.14531: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.14733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.17936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.17998: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.18029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.18057: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.18081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.18142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.18168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.18187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.18213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.18224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.18304: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.18317: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204631.18320: when evaluation is False, skipping this task 46400 1727204631.18323: _execute() done 46400 1727204631.18325: dumping result to json 46400 1727204631.18328: done dumping result, returning 46400 1727204631.18335: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000024aa] 46400 
1727204631.18340: sending task result for task 0affcd87-79f5-1303-fda8-0000000024aa 46400 1727204631.18437: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024aa 46400 1727204631.18441: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204631.18513: no more pending results, returning what we have 46400 1727204631.18517: results queue empty 46400 1727204631.18518: checking for any_errors_fatal 46400 1727204631.18527: done checking for any_errors_fatal 46400 1727204631.18528: checking for max_fail_percentage 46400 1727204631.18529: done checking for max_fail_percentage 46400 1727204631.18530: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.18531: done checking to see if all hosts have failed 46400 1727204631.18532: getting the remaining hosts for this loop 46400 1727204631.18533: done getting the remaining hosts for this loop 46400 1727204631.18538: getting the next task for host managed-node2 46400 1727204631.18545: done getting next task for host managed-node2 46400 1727204631.18550: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204631.18555: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204631.18620: getting variables 46400 1727204631.18622: in VariableManager get_vars() 46400 1727204631.18670: Calling all_inventory to load vars for managed-node2 46400 1727204631.18674: Calling groups_inventory to load vars for managed-node2 46400 1727204631.18676: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.18688: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.18691: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.18696: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.22401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.24839: done with get_vars() 46400 1727204631.24884: done getting variables 46400 1727204631.24957: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.120) 0:02:01.534 ***** 46400 1727204631.25009: entering _queue_task() for managed-node2/fail 46400 1727204631.25398: worker is 1 (out of 1 available) 46400 1727204631.25410: exiting _queue_task() for managed-node2/fail 46400 1727204631.25433: done queuing things up, now waiting for results queue to drain 46400 1727204631.25435: waiting for pending results... 
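The fail action plugin is loaded for the task at roles/network/tasks/main.yml:60, and the skip that follows reports the same wireless/team condition as before. A minimal sketch; the message wording is purely illustrative and additional guards (for example a user-consent variable) are likely present in the real role but are not visible in this log.

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    ansible.builtin.fail:
      # the message below is illustrative only; the real text is not shown in this log
      msg: >-
        NetworkManager needs to be restarted to activate support for wireless or team
        interfaces, which the role will not do without explicit consent.
    when:
      - __network_wireless_connections_defined or __network_team_connections_defined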
46400 1727204631.25826: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204631.26009: in run() - task 0affcd87-79f5-1303-fda8-0000000024ab 46400 1727204631.26030: variable 'ansible_search_path' from source: unknown 46400 1727204631.26039: variable 'ansible_search_path' from source: unknown 46400 1727204631.26083: calling self._execute() 46400 1727204631.26202: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.26225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.26240: variable 'omit' from source: magic vars 46400 1727204631.26667: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.26685: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.26824: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.27056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.30302: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.30385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.30451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.30499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.30539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.30637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.30678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.30715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.30771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.30802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.30860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.30897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.30930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.30983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.31009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.31066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.31095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.31134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.31187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.31206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.31420: variable 'network_connections' from source: include params 46400 1727204631.31447: variable 'interface' from source: play vars 46400 1727204631.31538: variable 'interface' from source: play vars 46400 1727204631.31634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204631.31843: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204631.31896: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204631.31939: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204631.31989: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204631.32043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204631.32071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204631.32126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.32168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204631.32245: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204631.32658: variable 'network_connections' 
from source: include params 46400 1727204631.32679: variable 'interface' from source: play vars 46400 1727204631.32793: variable 'interface' from source: play vars 46400 1727204631.32835: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204631.32846: when evaluation is False, skipping this task 46400 1727204631.32855: _execute() done 46400 1727204631.32861: dumping result to json 46400 1727204631.32872: done dumping result, returning 46400 1727204631.32887: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000024ab] 46400 1727204631.32908: sending task result for task 0affcd87-79f5-1303-fda8-0000000024ab skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204631.33137: no more pending results, returning what we have 46400 1727204631.33142: results queue empty 46400 1727204631.33143: checking for any_errors_fatal 46400 1727204631.33149: done checking for any_errors_fatal 46400 1727204631.33150: checking for max_fail_percentage 46400 1727204631.33152: done checking for max_fail_percentage 46400 1727204631.33153: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.33154: done checking to see if all hosts have failed 46400 1727204631.33154: getting the remaining hosts for this loop 46400 1727204631.33159: done getting the remaining hosts for this loop 46400 1727204631.33168: getting the next task for host managed-node2 46400 1727204631.33178: done getting next task for host managed-node2 46400 1727204631.33183: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204631.33188: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204631.33231: getting variables 46400 1727204631.33233: in VariableManager get_vars() 46400 1727204631.33286: Calling all_inventory to load vars for managed-node2 46400 1727204631.33289: Calling groups_inventory to load vars for managed-node2 46400 1727204631.33291: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.33302: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.33305: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.33307: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.34337: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024ab 46400 1727204631.34341: WORKER PROCESS EXITING 46400 1727204631.36834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.38809: done with get_vars() 46400 1727204631.38993: done getting variables 46400 1727204631.39390: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.145) 0:02:01.680 ***** 46400 1727204631.39598: entering _queue_task() for managed-node2/package 46400 1727204631.40658: worker is 1 (out of 1 available) 46400 1727204631.40680: exiting _queue_task() for managed-node2/package 46400 1727204631.40765: done queuing things up, now waiting for results queue to drain 46400 1727204631.40768: waiting for pending results... 
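The package action plugin is loaded here, and the skip that follows exposes both the variable and the guard: network_packages and "not network_packages is subset(ansible_facts.packages.keys())", i.e. installation is skipped when every requested package already appears in the gathered package facts. A minimal sketch of the task at roles/network/tasks/main.yml:73; state: present is an assumption.

  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"
      state: present  # assumed
    when:
      - not network_packages is subset(ansible_facts.packages.keys())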
46400 1727204631.41498: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204631.41615: in run() - task 0affcd87-79f5-1303-fda8-0000000024ac 46400 1727204631.41626: variable 'ansible_search_path' from source: unknown 46400 1727204631.41629: variable 'ansible_search_path' from source: unknown 46400 1727204631.41667: calling self._execute() 46400 1727204631.41751: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.41754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.41764: variable 'omit' from source: magic vars 46400 1727204631.42082: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.42092: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.42238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204631.42462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204631.42484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204631.42514: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204631.42573: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204631.42657: variable 'network_packages' from source: role '' defaults 46400 1727204631.42736: variable '__network_provider_setup' from source: role '' defaults 46400 1727204631.42744: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204631.42794: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204631.42801: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204631.42849: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204631.42968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.44989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.44993: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.44996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.45014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.45396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.45400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.45404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.45407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.45409: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.45413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.45416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.45421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.45424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.45426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.45428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.45667: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204631.45994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.45998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.46000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.46003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.46005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.46007: variable 'ansible_python' from source: facts 46400 1727204631.46009: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204631.46066: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204631.46171: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204631.46305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.46341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204631.46355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.46390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.46404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.46454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.46480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.46505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.46549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.46564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.46731: variable 'network_connections' from source: include params 46400 1727204631.46734: variable 'interface' from source: play vars 46400 1727204631.46849: variable 'interface' from source: play vars 46400 1727204631.46932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204631.46967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204631.46996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.47030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204631.47084: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.47412: variable 'network_connections' from source: include params 46400 1727204631.47416: variable 'interface' from source: play vars 46400 1727204631.47522: variable 'interface' from source: play vars 46400 1727204631.47557: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204631.47652: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.47976: variable 'network_connections' from source: include params 46400 
1727204631.47979: variable 'interface' from source: play vars 46400 1727204631.48042: variable 'interface' from source: play vars 46400 1727204631.48066: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204631.48150: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204631.48541: variable 'network_connections' from source: include params 46400 1727204631.48547: variable 'interface' from source: play vars 46400 1727204631.48624: variable 'interface' from source: play vars 46400 1727204631.48699: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204631.48759: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204631.48776: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204631.48849: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204631.49092: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204631.49605: variable 'network_connections' from source: include params 46400 1727204631.49608: variable 'interface' from source: play vars 46400 1727204631.49654: variable 'interface' from source: play vars 46400 1727204631.49665: variable 'ansible_distribution' from source: facts 46400 1727204631.49668: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.49674: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.49685: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204631.49797: variable 'ansible_distribution' from source: facts 46400 1727204631.49800: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.49805: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.49817: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204631.49929: variable 'ansible_distribution' from source: facts 46400 1727204631.49933: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.49935: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.49972: variable 'network_provider' from source: set_fact 46400 1727204631.49984: variable 'ansible_facts' from source: unknown 46400 1727204631.50496: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204631.50500: when evaluation is False, skipping this task 46400 1727204631.50502: _execute() done 46400 1727204631.50506: dumping result to json 46400 1727204631.50508: done dumping result, returning 46400 1727204631.50517: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-0000000024ac] 46400 1727204631.50520: sending task result for task 0affcd87-79f5-1303-fda8-0000000024ac 46400 1727204631.50618: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024ac 46400 1727204631.50621: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204631.50678: no more pending results, returning what we have 46400 1727204631.50683: results queue empty 46400 1727204631.50684: checking for any_errors_fatal 46400 1727204631.50690: done checking for 
any_errors_fatal 46400 1727204631.50691: checking for max_fail_percentage 46400 1727204631.50693: done checking for max_fail_percentage 46400 1727204631.50694: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.50694: done checking to see if all hosts have failed 46400 1727204631.50695: getting the remaining hosts for this loop 46400 1727204631.50697: done getting the remaining hosts for this loop 46400 1727204631.50701: getting the next task for host managed-node2 46400 1727204631.50709: done getting next task for host managed-node2 46400 1727204631.50713: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204631.50718: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204631.50749: getting variables 46400 1727204631.50751: in VariableManager get_vars() 46400 1727204631.50826: Calling all_inventory to load vars for managed-node2 46400 1727204631.50828: Calling groups_inventory to load vars for managed-node2 46400 1727204631.50831: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.50842: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.50845: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.50848: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.52140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.53909: done with get_vars() 46400 1727204631.53932: done getting variables 46400 1727204631.53984: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.144) 0:02:01.824 ***** 46400 1727204631.54012: entering _queue_task() for managed-node2/package 46400 1727204631.54277: worker is 1 (out of 1 available) 46400 1727204631.54290: exiting _queue_task() for managed-node2/package 46400 1727204631.54304: done queuing things up, now waiting for results queue to drain 46400 1727204631.54305: waiting for pending results... 
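Again a package task; the skip below shows the guard network_state != {}, so these extra packages are only pulled in when the declarative network_state interface is being used. A sketch of the task at roles/network/tasks/main.yml:85, with package names taken from the task title and state: present assumed.

  - name: Install NetworkManager and nmstate when using network_state variable
    ansible.builtin.package:
      name:
        - NetworkManager  # taken from the task title
        - nmstate         # taken from the task title
      state: present      # assumed
    when:
      - network_state != {}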
46400 1727204631.54578: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204631.54705: in run() - task 0affcd87-79f5-1303-fda8-0000000024ad 46400 1727204631.54717: variable 'ansible_search_path' from source: unknown 46400 1727204631.54722: variable 'ansible_search_path' from source: unknown 46400 1727204631.54752: calling self._execute() 46400 1727204631.54846: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.54850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.54858: variable 'omit' from source: magic vars 46400 1727204631.55237: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.55251: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.55345: variable 'network_state' from source: role '' defaults 46400 1727204631.55354: Evaluated conditional (network_state != {}): False 46400 1727204631.55358: when evaluation is False, skipping this task 46400 1727204631.55378: _execute() done 46400 1727204631.55382: dumping result to json 46400 1727204631.55384: done dumping result, returning 46400 1727204631.55389: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000024ad] 46400 1727204631.55391: sending task result for task 0affcd87-79f5-1303-fda8-0000000024ad 46400 1727204631.55491: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024ad 46400 1727204631.55494: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204631.55660: no more pending results, returning what we have 46400 1727204631.55669: results queue empty 46400 1727204631.55670: checking for any_errors_fatal 46400 1727204631.55676: done checking for any_errors_fatal 46400 1727204631.55677: checking for max_fail_percentage 46400 1727204631.55678: done checking for max_fail_percentage 46400 1727204631.55679: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.55680: done checking to see if all hosts have failed 46400 1727204631.55681: getting the remaining hosts for this loop 46400 1727204631.55682: done getting the remaining hosts for this loop 46400 1727204631.55686: getting the next task for host managed-node2 46400 1727204631.55693: done getting next task for host managed-node2 46400 1727204631.55698: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204631.55702: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204631.55735: getting variables 46400 1727204631.55736: in VariableManager get_vars() 46400 1727204631.55773: Calling all_inventory to load vars for managed-node2 46400 1727204631.55775: Calling groups_inventory to load vars for managed-node2 46400 1727204631.55776: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.55784: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.55786: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.55787: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.56829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.57870: done with get_vars() 46400 1727204631.57900: done getting variables 46400 1727204631.57962: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.039) 0:02:01.864 ***** 46400 1727204631.57989: entering _queue_task() for managed-node2/package 46400 1727204631.58270: worker is 1 (out of 1 available) 46400 1727204631.58284: exiting _queue_task() for managed-node2/package 46400 1727204631.58298: done queuing things up, now waiting for results queue to drain 46400 1727204631.58300: waiting for pending results... 
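The companion task at roles/network/tasks/main.yml:96 differs only in the package it targets (the Python bindings for nmstate) and is guarded by the same network_state != {} condition shown in the skip below.

  - name: Install python3-libnmstate when using network_state variable
    ansible.builtin.package:
      name: python3-libnmstate  # taken from the task title
      state: present            # assumed
    when:
      - network_state != {}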
46400 1727204631.58549: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204631.58654: in run() - task 0affcd87-79f5-1303-fda8-0000000024ae 46400 1727204631.58707: variable 'ansible_search_path' from source: unknown 46400 1727204631.58757: variable 'ansible_search_path' from source: unknown 46400 1727204631.58761: calling self._execute() 46400 1727204631.58865: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.58869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.58873: variable 'omit' from source: magic vars 46400 1727204631.59230: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.59256: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.59339: variable 'network_state' from source: role '' defaults 46400 1727204631.59348: Evaluated conditional (network_state != {}): False 46400 1727204631.59352: when evaluation is False, skipping this task 46400 1727204631.59355: _execute() done 46400 1727204631.59358: dumping result to json 46400 1727204631.59360: done dumping result, returning 46400 1727204631.59370: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-0000000024ae] 46400 1727204631.59376: sending task result for task 0affcd87-79f5-1303-fda8-0000000024ae 46400 1727204631.59469: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024ae 46400 1727204631.59471: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204631.59553: no more pending results, returning what we have 46400 1727204631.59558: results queue empty 46400 1727204631.59559: checking for any_errors_fatal 46400 1727204631.59570: done checking for any_errors_fatal 46400 1727204631.59571: checking for max_fail_percentage 46400 1727204631.59572: done checking for max_fail_percentage 46400 1727204631.59573: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.59574: done checking to see if all hosts have failed 46400 1727204631.59575: getting the remaining hosts for this loop 46400 1727204631.59577: done getting the remaining hosts for this loop 46400 1727204631.59589: getting the next task for host managed-node2 46400 1727204631.59597: done getting next task for host managed-node2 46400 1727204631.59601: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204631.59606: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204631.59631: getting variables 46400 1727204631.59632: in VariableManager get_vars() 46400 1727204631.59671: Calling all_inventory to load vars for managed-node2 46400 1727204631.59674: Calling groups_inventory to load vars for managed-node2 46400 1727204631.59676: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.59685: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.59695: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.59699: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.60884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.62404: done with get_vars() 46400 1727204631.62441: done getting variables 46400 1727204631.62526: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.045) 0:02:01.910 ***** 46400 1727204631.62566: entering _queue_task() for managed-node2/service 46400 1727204631.62914: worker is 1 (out of 1 available) 46400 1727204631.62927: exiting _queue_task() for managed-node2/service 46400 1727204631.62941: done queuing things up, now waiting for results queue to drain 46400 1727204631.62942: waiting for pending results... 
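For orientation: these install tasks only become relevant when a caller opts into the nmstate path by defining network_state. A hypothetical invocation sketch follows, assuming nmstate's declarative interface format; the variable name network_state and the role name fedora.linux_system_roles.network come from this log, while everything inside the interfaces list (eth1, DHCP) is illustrative only and not taken from this run.

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:            # any non-empty dict flips the network_state != {} conditionals to True
              interfaces:
                - name: eth1          # illustrative interface name, not from this run
                  type: ethernet
                  state: up
                  ipv4:
                    enabled: true
                    dhcp: true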
46400 1727204631.63298: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204631.63499: in run() - task 0affcd87-79f5-1303-fda8-0000000024af 46400 1727204631.63504: variable 'ansible_search_path' from source: unknown 46400 1727204631.63507: variable 'ansible_search_path' from source: unknown 46400 1727204631.63511: calling self._execute() 46400 1727204631.63611: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.63614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.63656: variable 'omit' from source: magic vars 46400 1727204631.65267: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.65279: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.65415: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.66137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.70511: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.70577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.70607: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.70681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.70708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.70893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.70897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.70900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.70902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.70922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.71023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.71026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.71029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204631.71077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.71080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.71132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.71135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.71375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.71378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.71381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.71485: variable 'network_connections' from source: include params 46400 1727204631.71489: variable 'interface' from source: play vars 46400 1727204631.71491: variable 'interface' from source: play vars 46400 1727204631.71542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204631.71712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204631.71760: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204631.71795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204631.71822: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204631.72274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204631.72277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204631.72279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.72282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204631.72284: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204631.72287: variable 'network_connections' from source: include params 46400 1727204631.72289: variable 'interface' 
from source: play vars 46400 1727204631.72337: variable 'interface' from source: play vars 46400 1727204631.72360: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204631.72368: when evaluation is False, skipping this task 46400 1727204631.72690: _execute() done 46400 1727204631.72693: dumping result to json 46400 1727204631.72696: done dumping result, returning 46400 1727204631.72705: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-0000000024af] 46400 1727204631.72714: sending task result for task 0affcd87-79f5-1303-fda8-0000000024af 46400 1727204631.72807: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024af skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204631.72866: no more pending results, returning what we have 46400 1727204631.72871: results queue empty 46400 1727204631.72872: checking for any_errors_fatal 46400 1727204631.72879: done checking for any_errors_fatal 46400 1727204631.72880: checking for max_fail_percentage 46400 1727204631.72882: done checking for max_fail_percentage 46400 1727204631.72883: checking to see if all hosts have failed and the running result is not ok 46400 1727204631.72884: done checking to see if all hosts have failed 46400 1727204631.72884: getting the remaining hosts for this loop 46400 1727204631.72886: done getting the remaining hosts for this loop 46400 1727204631.72891: getting the next task for host managed-node2 46400 1727204631.72900: done getting next task for host managed-node2 46400 1727204631.72905: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204631.72910: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204631.72945: getting variables 46400 1727204631.72947: in VariableManager get_vars() 46400 1727204631.72998: Calling all_inventory to load vars for managed-node2 46400 1727204631.73001: Calling groups_inventory to load vars for managed-node2 46400 1727204631.73003: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204631.73014: Calling all_plugins_play to load vars for managed-node2 46400 1727204631.73017: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204631.73020: Calling groups_plugins_play to load vars for managed-node2 46400 1727204631.73621: WORKER PROCESS EXITING 46400 1727204631.74915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204631.77809: done with get_vars() 46400 1727204631.77847: done getting variables 46400 1727204631.78402: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.158) 0:02:02.068 ***** 46400 1727204631.78440: entering _queue_task() for managed-node2/service 46400 1727204631.79017: worker is 1 (out of 1 available) 46400 1727204631.79030: exiting _queue_task() for managed-node2/service 46400 1727204631.79041: done queuing things up, now waiting for results queue to drain 46400 1727204631.79043: waiting for pending results... 
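Unlike the three skipped tasks above, the "Enable and start NetworkManager" task queued here does execute: the executor evaluates (network_provider == "nm" or network_state != {}) as True, and the systemd module invocation later in this log shows the service action ultimately called with name=NetworkManager, state=started, enabled=true. A minimal sketch of a task consistent with that behaviour, assuming the service action plugin seen in the log and the network_service_name role default it dereferences:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager on this host per the module_args below
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}   # evaluated True in this run

Because the unit is already active and enabled on managed-node2, the module result further down reports changed: false.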
46400 1727204631.79641: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204631.79769: in run() - task 0affcd87-79f5-1303-fda8-0000000024b0 46400 1727204631.79784: variable 'ansible_search_path' from source: unknown 46400 1727204631.79788: variable 'ansible_search_path' from source: unknown 46400 1727204631.79832: calling self._execute() 46400 1727204631.79954: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.79961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.79976: variable 'omit' from source: magic vars 46400 1727204631.80433: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.80446: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204631.80667: variable 'network_provider' from source: set_fact 46400 1727204631.80671: variable 'network_state' from source: role '' defaults 46400 1727204631.80684: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204631.80696: variable 'omit' from source: magic vars 46400 1727204631.80772: variable 'omit' from source: magic vars 46400 1727204631.80804: variable 'network_service_name' from source: role '' defaults 46400 1727204631.80873: variable 'network_service_name' from source: role '' defaults 46400 1727204631.80983: variable '__network_provider_setup' from source: role '' defaults 46400 1727204631.80988: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204631.81055: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204631.81067: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204631.81133: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204631.82109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204631.86784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204631.87449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204631.87493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204631.87530: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204631.87556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204631.87647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.87685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.87708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.87750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 46400 1727204631.87767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.87812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.87853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.87865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.87907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.87922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.88167: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204631.88281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.88302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.88325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.88360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.88381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.88475: variable 'ansible_python' from source: facts 46400 1727204631.88492: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204631.88574: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204631.88648: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204631.88768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.88794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.88820: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.88859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.88877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.88934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204631.88963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204631.88991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.89033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204631.89051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204631.89431: variable 'network_connections' from source: include params 46400 1727204631.89440: variable 'interface' from source: play vars 46400 1727204631.89521: variable 'interface' from source: play vars 46400 1727204631.89633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204631.89947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204631.90149: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204631.90293: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204631.90334: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204631.90506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204631.90536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204631.90571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204631.90698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204631.90845: variable '__network_wireless_connections_defined' from source: 
role '' defaults 46400 1727204631.91355: variable 'network_connections' from source: include params 46400 1727204631.91385: variable 'interface' from source: play vars 46400 1727204631.91474: variable 'interface' from source: play vars 46400 1727204631.91522: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204631.91614: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204631.91924: variable 'network_connections' from source: include params 46400 1727204631.91928: variable 'interface' from source: play vars 46400 1727204631.92010: variable 'interface' from source: play vars 46400 1727204631.92033: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204631.92123: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204631.92425: variable 'network_connections' from source: include params 46400 1727204631.92428: variable 'interface' from source: play vars 46400 1727204631.92506: variable 'interface' from source: play vars 46400 1727204631.92557: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204631.92621: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204631.92628: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204631.92700: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204631.92924: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204631.93425: variable 'network_connections' from source: include params 46400 1727204631.93440: variable 'interface' from source: play vars 46400 1727204631.93502: variable 'interface' from source: play vars 46400 1727204631.93509: variable 'ansible_distribution' from source: facts 46400 1727204631.93512: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.93519: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.93533: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204631.94012: variable 'ansible_distribution' from source: facts 46400 1727204631.94026: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.94036: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.94054: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204631.94473: variable 'ansible_distribution' from source: facts 46400 1727204631.94484: variable '__network_rh_distros' from source: role '' defaults 46400 1727204631.94496: variable 'ansible_distribution_major_version' from source: facts 46400 1727204631.94575: variable 'network_provider' from source: set_fact 46400 1727204631.94604: variable 'omit' from source: magic vars 46400 1727204631.94637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204631.94677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204631.94709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204631.94731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204631.94747: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204631.94788: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204631.94800: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.94808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.95830: Set connection var ansible_shell_type to sh 46400 1727204631.95846: Set connection var ansible_shell_executable to /bin/sh 46400 1727204631.95856: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204631.95869: Set connection var ansible_connection to ssh 46400 1727204631.95879: Set connection var ansible_pipelining to False 46400 1727204631.95889: Set connection var ansible_timeout to 10 46400 1727204631.95929: variable 'ansible_shell_executable' from source: unknown 46400 1727204631.96030: variable 'ansible_connection' from source: unknown 46400 1727204631.96039: variable 'ansible_module_compression' from source: unknown 46400 1727204631.96046: variable 'ansible_shell_type' from source: unknown 46400 1727204631.96053: variable 'ansible_shell_executable' from source: unknown 46400 1727204631.96059: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204631.96069: variable 'ansible_pipelining' from source: unknown 46400 1727204631.96077: variable 'ansible_timeout' from source: unknown 46400 1727204631.96084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204631.96199: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204631.96361: variable 'omit' from source: magic vars 46400 1727204631.96374: starting attempt loop 46400 1727204631.96380: running the handler 46400 1727204631.96588: variable 'ansible_facts' from source: unknown 46400 1727204631.98204: _low_level_execute_command(): starting 46400 1727204631.98218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204632.00632: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.00702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.00721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.00741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.00792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.00912: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.00930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.00948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.00959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.00972: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.00984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.00996: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 46400 1727204632.01015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.01031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.01042: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.01054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.01160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.01251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.01270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.01352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.03015: stdout chunk (state=3): >>>/root <<< 46400 1727204632.03216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204632.03220: stdout chunk (state=3): >>><<< 46400 1727204632.03222: stderr chunk (state=3): >>><<< 46400 1727204632.03271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204632.03274: _low_level_execute_command(): starting 46400 1727204632.03278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202 `" && echo ansible-tmp-1727204632.0324273-55005-210041611978202="` echo /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202 `" ) && sleep 0' 46400 1727204632.05140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.05155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.05174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.05194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.05350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.05363: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.05380: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.05398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.05410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.05426: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.05445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.05460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.05480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.05493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.05503: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.05515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.05593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.05668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.05685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.05769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.07669: stdout chunk (state=3): >>>ansible-tmp-1727204632.0324273-55005-210041611978202=/root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202 <<< 46400 1727204632.07884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204632.07888: stdout chunk (state=3): >>><<< 46400 1727204632.07890: stderr chunk (state=3): >>><<< 46400 1727204632.08173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204632.0324273-55005-210041611978202=/root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204632.08183: variable 'ansible_module_compression' from source: unknown 46400 1727204632.08185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204632.08187: variable 'ansible_facts' from source: unknown 46400 1727204632.08249: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/AnsiballZ_systemd.py 46400 1727204632.08975: Sending initial data 46400 1727204632.08978: Sent initial data (156 bytes) 46400 1727204632.11834: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.11906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.11925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.11943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.11999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.12125: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.12142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.12165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.12180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.12192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.12204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.12223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.12244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.12263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.12279: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.12293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.12382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.12472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.12492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.12578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.14337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204632.14400: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204632.14434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp3vnnjo4h /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/AnsiballZ_systemd.py <<< 46400 1727204632.14777: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 
46400 1727204632.17616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204632.17753: stderr chunk (state=3): >>><<< 46400 1727204632.17756: stdout chunk (state=3): >>><<< 46400 1727204632.17759: done transferring module to remote 46400 1727204632.17770: _low_level_execute_command(): starting 46400 1727204632.17773: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/ /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/AnsiballZ_systemd.py && sleep 0' 46400 1727204632.19278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.19302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.19318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.19335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.19386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.19404: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.19418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.19437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.19449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.19467: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.19482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.19496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.19519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.19533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.19543: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.19557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.19645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.19674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.19693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.19776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.21584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204632.21651: stderr chunk (state=3): >>><<< 46400 1727204632.21654: stdout chunk (state=3): >>><<< 46400 1727204632.21752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204632.21763: _low_level_execute_command(): starting 46400 1727204632.21768: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/AnsiballZ_systemd.py && sleep 0' 46400 1727204632.23129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.23139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.23223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.23252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.23287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.23290: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.23293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.23295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.23297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.23299: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.23304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.23306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.23308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.23310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.23363: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.23368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.23475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.23478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.23692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.23787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.48975: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": 
"0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 46400 1727204632.49047: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6942720", "MemoryAvailable": "infinity", "CPUUsageNSec": "2298753000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", 
"LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 46400 1727204632.49075: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", 
"ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204632.50636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204632.50693: stderr chunk (state=3): >>><<< 46400 1727204632.50696: stdout chunk (state=3): >>><<< 46400 1727204632.50713: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6942720", "MemoryAvailable": "infinity", "CPUUsageNSec": "2298753000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", 
"ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204632.50835: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204632.50850: _low_level_execute_command(): starting 46400 1727204632.50855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204632.0324273-55005-210041611978202/ > /dev/null 2>&1 && sleep 0' 46400 1727204632.52665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204632.52682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.52717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.52750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.52795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.52801: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204632.52811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.52827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204632.52837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204632.52844: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204632.52852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204632.52863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204632.52874: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204632.52881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204632.52891: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204632.52897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204632.52972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204632.52991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204632.53003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204632.53094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204632.54875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204632.54960: stderr chunk (state=3): >>><<< 46400 1727204632.54973: stdout chunk (state=3): >>><<< 46400 1727204632.54992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204632.55001: handler run complete 46400 1727204632.55069: attempt loop complete, returning result 46400 1727204632.55073: _execute() done 46400 1727204632.55076: dumping result to json 46400 1727204632.55096: done dumping result, returning 46400 1727204632.55107: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-0000000024b0] 46400 1727204632.55113: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b0 46400 1727204632.55413: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b0 46400 1727204632.55416: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204632.55485: no more pending results, returning what we have 46400 1727204632.55491: results queue empty 46400 1727204632.55492: checking for any_errors_fatal 46400 1727204632.55499: done checking for any_errors_fatal 46400 1727204632.55500: checking for max_fail_percentage 46400 1727204632.55502: done checking for max_fail_percentage 46400 1727204632.55503: checking to see if all hosts have failed and the running result is not ok 46400 1727204632.55504: done checking to see if all 
hosts have failed 46400 1727204632.55505: getting the remaining hosts for this loop 46400 1727204632.55506: done getting the remaining hosts for this loop 46400 1727204632.55511: getting the next task for host managed-node2 46400 1727204632.55519: done getting next task for host managed-node2 46400 1727204632.55523: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204632.55529: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204632.55543: getting variables 46400 1727204632.55545: in VariableManager get_vars() 46400 1727204632.55590: Calling all_inventory to load vars for managed-node2 46400 1727204632.55593: Calling groups_inventory to load vars for managed-node2 46400 1727204632.55595: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204632.55605: Calling all_plugins_play to load vars for managed-node2 46400 1727204632.55608: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204632.55610: Calling groups_plugins_play to load vars for managed-node2 46400 1727204632.57797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204632.58969: done with get_vars() 46400 1727204632.59005: done getting variables 46400 1727204632.59081: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.806) 0:02:02.875 ***** 46400 1727204632.59120: entering _queue_task() for managed-node2/service 46400 1727204632.59526: worker is 1 (out of 1 available) 46400 1727204632.59540: exiting _queue_task() for managed-node2/service 46400 1727204632.59581: done queuing things up, now waiting for results queue to drain 46400 1727204632.59586: waiting for pending results... 
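[editor's note] The "Enable and start NetworkManager" result returned above is censored by no_log, but the raw stdout shows the module_args that were sent: name=NetworkManager, state=started, enabled=true, scope=system. As a hedged, standalone sketch (not the role's own task file, which lives in roles/network/tasks/main.yml), the equivalent plain task would look like this:

# Hedged sketch: a standalone play issuing the same systemd call the role
# made above. The host pattern comes from the log; the task body is an
# ordinary ansible.builtin.systemd call, not the role's exact source.
- hosts: managed-node2
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
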
46400 1727204632.59991: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204632.60176: in run() - task 0affcd87-79f5-1303-fda8-0000000024b1 46400 1727204632.60197: variable 'ansible_search_path' from source: unknown 46400 1727204632.60205: variable 'ansible_search_path' from source: unknown 46400 1727204632.60246: calling self._execute() 46400 1727204632.60357: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204632.60380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204632.60395: variable 'omit' from source: magic vars 46400 1727204632.60808: variable 'ansible_distribution_major_version' from source: facts 46400 1727204632.60826: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204632.60954: variable 'network_provider' from source: set_fact 46400 1727204632.60967: Evaluated conditional (network_provider == "nm"): True 46400 1727204632.61069: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204632.61174: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204632.61390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204632.65257: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204632.65342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204632.65388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204632.65432: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204632.65465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204632.65801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204632.65832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204632.65858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204632.65908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204632.65928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204632.65979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204632.66013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204632.66041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204632.66086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204632.66107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204632.66182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204632.66207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204632.66241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204632.66304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204632.66325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204632.66510: variable 'network_connections' from source: include params 46400 1727204632.66528: variable 'interface' from source: play vars 46400 1727204632.66613: variable 'interface' from source: play vars 46400 1727204632.66701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204632.66999: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204632.67042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204632.67134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204632.67205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204632.67330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204632.67367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204632.67406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204632.67446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204632.67504: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204632.67803: variable 'network_connections' from source: include params 46400 1727204632.67815: variable 'interface' from source: play vars 46400 1727204632.67894: variable 'interface' from source: play vars 46400 1727204632.67929: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204632.67937: when evaluation is False, skipping this task 46400 1727204632.67944: _execute() done 46400 1727204632.67951: dumping result to json 46400 1727204632.67958: done dumping result, returning 46400 1727204632.67977: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-0000000024b1] 46400 1727204632.68002: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b1 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204632.68162: no more pending results, returning what we have 46400 1727204632.68168: results queue empty 46400 1727204632.68170: checking for any_errors_fatal 46400 1727204632.68193: done checking for any_errors_fatal 46400 1727204632.68194: checking for max_fail_percentage 46400 1727204632.68197: done checking for max_fail_percentage 46400 1727204632.68198: checking to see if all hosts have failed and the running result is not ok 46400 1727204632.68199: done checking to see if all hosts have failed 46400 1727204632.68200: getting the remaining hosts for this loop 46400 1727204632.68202: done getting the remaining hosts for this loop 46400 1727204632.68207: getting the next task for host managed-node2 46400 1727204632.68217: done getting next task for host managed-node2 46400 1727204632.68222: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204632.68227: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204632.68260: getting variables 46400 1727204632.68262: in VariableManager get_vars() 46400 1727204632.68313: Calling all_inventory to load vars for managed-node2 46400 1727204632.68316: Calling groups_inventory to load vars for managed-node2 46400 1727204632.68318: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204632.68330: Calling all_plugins_play to load vars for managed-node2 46400 1727204632.68333: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204632.68336: Calling groups_plugins_play to load vars for managed-node2 46400 1727204632.69307: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b1 46400 1727204632.69310: WORKER PROCESS EXITING 46400 1727204632.70237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204632.73474: done with get_vars() 46400 1727204632.73502: done getting variables 46400 1727204632.73688: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.145) 0:02:03.021 ***** 46400 1727204632.73722: entering _queue_task() for managed-node2/service 46400 1727204632.74428: worker is 1 (out of 1 available) 46400 1727204632.74545: exiting _queue_task() for managed-node2/service 46400 1727204632.74563: done queuing things up, now waiting for results queue to drain 46400 1727204632.74566: waiting for pending results... 
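[editor's note] The skip reported just above ("false_condition": "__network_wpa_supplicant_required") is the normal outcome of a `when:` guard: the conditional is evaluated from role defaults on the controller and, because it is false, the task never reaches the managed node. A hedged sketch of that guard pattern, mirroring the two conditionals the log shows being evaluated (the task body itself is illustrative, not the role's exact source):

# Hedged sketch of the guard behind the skip above.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
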
46400 1727204632.74893: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204632.75176: in run() - task 0affcd87-79f5-1303-fda8-0000000024b2 46400 1727204632.75250: variable 'ansible_search_path' from source: unknown 46400 1727204632.75326: variable 'ansible_search_path' from source: unknown 46400 1727204632.75378: calling self._execute() 46400 1727204632.75493: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204632.75506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204632.75520: variable 'omit' from source: magic vars 46400 1727204632.75917: variable 'ansible_distribution_major_version' from source: facts 46400 1727204632.75935: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204632.76068: variable 'network_provider' from source: set_fact 46400 1727204632.76079: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204632.76086: when evaluation is False, skipping this task 46400 1727204632.76098: _execute() done 46400 1727204632.76107: dumping result to json 46400 1727204632.76114: done dumping result, returning 46400 1727204632.76124: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-0000000024b2] 46400 1727204632.76135: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b2 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204632.76295: no more pending results, returning what we have 46400 1727204632.76300: results queue empty 46400 1727204632.76301: checking for any_errors_fatal 46400 1727204632.76314: done checking for any_errors_fatal 46400 1727204632.76314: checking for max_fail_percentage 46400 1727204632.76316: done checking for max_fail_percentage 46400 1727204632.76317: checking to see if all hosts have failed and the running result is not ok 46400 1727204632.76318: done checking to see if all hosts have failed 46400 1727204632.76319: getting the remaining hosts for this loop 46400 1727204632.76320: done getting the remaining hosts for this loop 46400 1727204632.76324: getting the next task for host managed-node2 46400 1727204632.76334: done getting next task for host managed-node2 46400 1727204632.76339: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204632.76345: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204632.76385: getting variables 46400 1727204632.76387: in VariableManager get_vars() 46400 1727204632.76442: Calling all_inventory to load vars for managed-node2 46400 1727204632.76445: Calling groups_inventory to load vars for managed-node2 46400 1727204632.76448: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204632.76462: Calling all_plugins_play to load vars for managed-node2 46400 1727204632.76467: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204632.76470: Calling groups_plugins_play to load vars for managed-node2 46400 1727204632.77416: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b2 46400 1727204632.77420: WORKER PROCESS EXITING 46400 1727204632.78547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204632.80292: done with get_vars() 46400 1727204632.80330: done getting variables 46400 1727204632.80400: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.067) 0:02:03.088 ***** 46400 1727204632.80442: entering _queue_task() for managed-node2/copy 46400 1727204632.80829: worker is 1 (out of 1 available) 46400 1727204632.80842: exiting _queue_task() for managed-node2/copy 46400 1727204632.80855: done queuing things up, now waiting for results queue to drain 46400 1727204632.80856: waiting for pending results... 
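[editor's note] The "Enable network service" task skipped just above is gated on the initscripts provider; with network_provider evaluated as "nm", the condition (network_provider == "initscripts") is False and the task is skipped without opening a connection. A hedged sketch of that guard; the service name is the classic initscripts unit and is illustrative rather than quoted from the role:

# Hedged sketch of the initscripts-only guard seen above.
- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"
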
46400 1727204632.81169: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204632.81313: in run() - task 0affcd87-79f5-1303-fda8-0000000024b3 46400 1727204632.81332: variable 'ansible_search_path' from source: unknown 46400 1727204632.81341: variable 'ansible_search_path' from source: unknown 46400 1727204632.81384: calling self._execute() 46400 1727204632.81512: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204632.81528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204632.81542: variable 'omit' from source: magic vars 46400 1727204632.81981: variable 'ansible_distribution_major_version' from source: facts 46400 1727204632.82005: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204632.82141: variable 'network_provider' from source: set_fact 46400 1727204632.82153: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204632.82161: when evaluation is False, skipping this task 46400 1727204632.82172: _execute() done 46400 1727204632.82180: dumping result to json 46400 1727204632.82187: done dumping result, returning 46400 1727204632.82199: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-0000000024b3] 46400 1727204632.82212: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b3 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204632.82387: no more pending results, returning what we have 46400 1727204632.82392: results queue empty 46400 1727204632.82393: checking for any_errors_fatal 46400 1727204632.82404: done checking for any_errors_fatal 46400 1727204632.82405: checking for max_fail_percentage 46400 1727204632.82408: done checking for max_fail_percentage 46400 1727204632.82409: checking to see if all hosts have failed and the running result is not ok 46400 1727204632.82410: done checking to see if all hosts have failed 46400 1727204632.82411: getting the remaining hosts for this loop 46400 1727204632.82413: done getting the remaining hosts for this loop 46400 1727204632.82418: getting the next task for host managed-node2 46400 1727204632.82430: done getting next task for host managed-node2 46400 1727204632.82436: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204632.82442: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204632.82482: getting variables 46400 1727204632.82484: in VariableManager get_vars() 46400 1727204632.82540: Calling all_inventory to load vars for managed-node2 46400 1727204632.82543: Calling groups_inventory to load vars for managed-node2 46400 1727204632.82546: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204632.82563: Calling all_plugins_play to load vars for managed-node2 46400 1727204632.82568: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204632.82572: Calling groups_plugins_play to load vars for managed-node2 46400 1727204632.83507: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b3 46400 1727204632.83511: WORKER PROCESS EXITING 46400 1727204632.84694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204632.86386: done with get_vars() 46400 1727204632.86424: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.060) 0:02:03.149 ***** 46400 1727204632.86520: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204632.86886: worker is 1 (out of 1 available) 46400 1727204632.86905: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204632.86921: done queuing things up, now waiting for results queue to drain 46400 1727204632.86923: waiting for pending results... 
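[editor's note] The "Configure networking connection profiles" task about to run below consumes `network_connections` (from include params) and `interface` (from play vars), as the variable trace shows, and turns them into a fedora.linux_system_roles.network_connections module call. A hedged sketch of the kind of caller-side input that produces this; the interface name and profile values are illustrative, not the test's actual data:

# Hedged sketch of invoking the role with a single ethernet profile.
- hosts: managed-node2
  vars:
    interface: testnic1            # hypothetical interface name
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ interface }}"
            interface_name: "{{ interface }}"
            type: ethernet
            state: up
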
46400 1727204632.87257: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204632.87446: in run() - task 0affcd87-79f5-1303-fda8-0000000024b4 46400 1727204632.87473: variable 'ansible_search_path' from source: unknown 46400 1727204632.87486: variable 'ansible_search_path' from source: unknown 46400 1727204632.87528: calling self._execute() 46400 1727204632.87648: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204632.87668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204632.87684: variable 'omit' from source: magic vars 46400 1727204632.88096: variable 'ansible_distribution_major_version' from source: facts 46400 1727204632.88117: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204632.88131: variable 'omit' from source: magic vars 46400 1727204632.88207: variable 'omit' from source: magic vars 46400 1727204632.88399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204632.90835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204632.90924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204632.90980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204632.91020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204632.91056: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204632.91153: variable 'network_provider' from source: set_fact 46400 1727204632.91306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204632.91340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204632.91380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204632.91432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204632.91453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204632.91541: variable 'omit' from source: magic vars 46400 1727204632.91658: variable 'omit' from source: magic vars 46400 1727204632.91770: variable 'network_connections' from source: include params 46400 1727204632.91788: variable 'interface' from source: play vars 46400 1727204632.91861: variable 'interface' from source: play vars 46400 1727204632.92024: variable 'omit' from source: magic vars 46400 1727204632.92041: variable '__lsr_ansible_managed' from source: task vars 46400 1727204632.92113: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204632.92351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204632.92593: Loaded config def from plugin (lookup/template) 46400 1727204632.92604: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204632.92636: File lookup term: get_ansible_managed.j2 46400 1727204632.92643: variable 'ansible_search_path' from source: unknown 46400 1727204632.92655: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204632.92679: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204632.92707: variable 'ansible_search_path' from source: unknown 46400 1727204632.99441: variable 'ansible_managed' from source: unknown 46400 1727204632.99604: variable 'omit' from source: magic vars 46400 1727204632.99648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204632.99685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204632.99710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204632.99743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204632.99758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204632.99792: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204632.99801: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204632.99809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204632.99909: Set connection var ansible_shell_type to sh 46400 1727204632.99925: Set connection var ansible_shell_executable to /bin/sh 46400 1727204632.99942: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204632.99955: Set connection var ansible_connection to ssh 46400 1727204632.99967: Set connection var ansible_pipelining to False 46400 1727204632.99978: Set connection var ansible_timeout to 10 46400 1727204633.00009: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.00017: variable 'ansible_connection' from source: unknown 46400 1727204633.00023: variable 'ansible_module_compression' 
from source: unknown 46400 1727204633.00028: variable 'ansible_shell_type' from source: unknown 46400 1727204633.00034: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.00047: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.00059: variable 'ansible_pipelining' from source: unknown 46400 1727204633.00069: variable 'ansible_timeout' from source: unknown 46400 1727204633.00077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.00225: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204633.00249: variable 'omit' from source: magic vars 46400 1727204633.00260: starting attempt loop 46400 1727204633.00272: running the handler 46400 1727204633.00292: _low_level_execute_command(): starting 46400 1727204633.00304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204633.01036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204633.01080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.01084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.01087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.01122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.01144: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.01149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.01151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.01210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.01213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.01216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.01267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.02913: stdout chunk (state=3): >>>/root <<< 46400 1727204633.03018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.03092: stderr chunk (state=3): >>><<< 46400 1727204633.03096: stdout chunk (state=3): >>><<< 46400 1727204633.03107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.03119: _low_level_execute_command(): starting 46400 1727204633.03124: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140 `" && echo ansible-tmp-1727204633.0310872-55055-15429536954140="` echo /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140 `" ) && sleep 0' 46400 1727204633.03770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204633.03774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.03777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.03779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.03781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.03784: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204633.03786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.03820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204633.03824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204633.03826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204633.03828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.03831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.03852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.03855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.03858: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204633.03863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.03941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.03945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.03952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.04036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.05874: stdout chunk (state=3): 
>>>ansible-tmp-1727204633.0310872-55055-15429536954140=/root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140 <<< 46400 1727204633.05993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.06052: stderr chunk (state=3): >>><<< 46400 1727204633.06056: stdout chunk (state=3): >>><<< 46400 1727204633.06077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204633.0310872-55055-15429536954140=/root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.06122: variable 'ansible_module_compression' from source: unknown 46400 1727204633.06158: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204633.06205: variable 'ansible_facts' from source: unknown 46400 1727204633.06294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/AnsiballZ_network_connections.py 46400 1727204633.06407: Sending initial data 46400 1727204633.06412: Sent initial data (167 bytes) 46400 1727204633.07133: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.07137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.07187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.07230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.07234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
46400 1727204633.07236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.07317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.08954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204633.08987: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204633.09025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpr5tzwc5s /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/AnsiballZ_network_connections.py <<< 46400 1727204633.09052: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204633.10219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.10322: stderr chunk (state=3): >>><<< 46400 1727204633.10326: stdout chunk (state=3): >>><<< 46400 1727204633.10346: done transferring module to remote 46400 1727204633.10355: _low_level_execute_command(): starting 46400 1727204633.10362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/ /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/AnsiballZ_network_connections.py && sleep 0' 46400 1727204633.10841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204633.10849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.10857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.10872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.10905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.10912: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204633.10921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.10930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204633.10944: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.10948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.11033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.11110: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.11178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.12909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.12930: stderr chunk (state=3): >>><<< 46400 1727204633.12933: stdout chunk (state=3): >>><<< 46400 1727204633.12948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.12955: _low_level_execute_command(): starting 46400 1727204633.12970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/AnsiballZ_network_connections.py && sleep 0' 46400 1727204633.13692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204633.13705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.13720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.13738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.13793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.13805: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204633.13819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.13837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204633.13849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204633.13872: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204633.13891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.13906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.13921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.13933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.13945: stderr chunk (state=3): >>>debug2: 
match found <<< 46400 1727204633.13958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.14045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.14067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.14085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.14174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.44200: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 46400 1727204633.44213: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kmpmhjr1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kmpmhjr1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/e77de78e-51aa-4006-a80a-c43a9ef40807: error=unknown <<< 46400 1727204633.44377: stdout chunk (state=3): >>> <<< 46400 1727204633.44380: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204633.46024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204633.46091: stderr chunk (state=3): >>><<< 46400 1727204633.46095: stdout chunk (state=3): >>><<< 46400 1727204633.46114: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kmpmhjr1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kmpmhjr1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/e77de78e-51aa-4006-a80a-c43a9ef40807: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
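Note: the module invocation captured above corresponds to the linux-system-roles network role tearing down and removing the 'statebr' connection profile through NetworkManager (provider: nm). A minimal sketch of an equivalent play, assuming the role is driven through its public network_connections variable as is typical (the actual play that produced this log is not shown here):

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_connections:
          # mirrors the module_args recorded in the log above
          - name: statebr
            persistent_state: absent
            state: down

Despite the LsrNetworkNmError traceback printed on stdout, the module still returned a JSON result with changed: true and an empty stderr, so the task below is reported as changed rather than failed.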
46400 1727204633.46143: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204633.46151: _low_level_execute_command(): starting 46400 1727204633.46156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204633.0310872-55055-15429536954140/ > /dev/null 2>&1 && sleep 0' 46400 1727204633.46645: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.46651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.46690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.46703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.46760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.46771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.46783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.46834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.48631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.48693: stderr chunk (state=3): >>><<< 46400 1727204633.48697: stdout chunk (state=3): >>><<< 46400 1727204633.48712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.48718: handler run complete 46400 1727204633.48737: attempt loop complete, returning result 46400 1727204633.48740: _execute() done 46400 1727204633.48744: dumping result to json 46400 1727204633.48746: done dumping result, returning 46400 1727204633.48757: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-0000000024b4] 46400 1727204633.48767: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b4 46400 1727204633.48868: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b4 46400 1727204633.48871: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 46400 1727204633.48991: no more pending results, returning what we have 46400 1727204633.48995: results queue empty 46400 1727204633.48996: checking for any_errors_fatal 46400 1727204633.49003: done checking for any_errors_fatal 46400 1727204633.49003: checking for max_fail_percentage 46400 1727204633.49005: done checking for max_fail_percentage 46400 1727204633.49006: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.49007: done checking to see if all hosts have failed 46400 1727204633.49007: getting the remaining hosts for this loop 46400 1727204633.49009: done getting the remaining hosts for this loop 46400 1727204633.49013: getting the next task for host managed-node2 46400 1727204633.49020: done getting next task for host managed-node2 46400 1727204633.49024: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204633.49029: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204633.49042: getting variables 46400 1727204633.49043: in VariableManager get_vars() 46400 1727204633.49098: Calling all_inventory to load vars for managed-node2 46400 1727204633.49100: Calling groups_inventory to load vars for managed-node2 46400 1727204633.49103: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.49113: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.49115: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.49118: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.50015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204633.50962: done with get_vars() 46400 1727204633.50984: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.645) 0:02:03.795 ***** 46400 1727204633.51052: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204633.51302: worker is 1 (out of 1 available) 46400 1727204633.51317: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204633.51331: done queuing things up, now waiting for results queue to drain 46400 1727204633.51333: waiting for pending results... 46400 1727204633.51535: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204633.51642: in run() - task 0affcd87-79f5-1303-fda8-0000000024b5 46400 1727204633.51655: variable 'ansible_search_path' from source: unknown 46400 1727204633.51659: variable 'ansible_search_path' from source: unknown 46400 1727204633.51695: calling self._execute() 46400 1727204633.51772: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.51778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.51788: variable 'omit' from source: magic vars 46400 1727204633.52080: variable 'ansible_distribution_major_version' from source: facts 46400 1727204633.52090: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204633.52182: variable 'network_state' from source: role '' defaults 46400 1727204633.52191: Evaluated conditional (network_state != {}): False 46400 1727204633.52194: when evaluation is False, skipping this task 46400 1727204633.52197: _execute() done 46400 1727204633.52200: dumping result to json 46400 1727204633.52203: done dumping result, returning 46400 1727204633.52208: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-0000000024b5] 46400 1727204633.52215: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b5 46400 1727204633.52310: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b5 46400 1727204633.52313: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204633.52382: no more pending results, returning what we have 46400 1727204633.52387: results queue empty 46400 1727204633.52388: checking 
for any_errors_fatal 46400 1727204633.52398: done checking for any_errors_fatal 46400 1727204633.52399: checking for max_fail_percentage 46400 1727204633.52401: done checking for max_fail_percentage 46400 1727204633.52402: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.52402: done checking to see if all hosts have failed 46400 1727204633.52403: getting the remaining hosts for this loop 46400 1727204633.52405: done getting the remaining hosts for this loop 46400 1727204633.52409: getting the next task for host managed-node2 46400 1727204633.52418: done getting next task for host managed-node2 46400 1727204633.52422: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204633.52427: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204633.52461: getting variables 46400 1727204633.52463: in VariableManager get_vars() 46400 1727204633.52502: Calling all_inventory to load vars for managed-node2 46400 1727204633.52505: Calling groups_inventory to load vars for managed-node2 46400 1727204633.52507: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.52516: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.52518: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.52520: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.53493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204633.54421: done with get_vars() 46400 1727204633.54440: done getting variables 46400 1727204633.54486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.034) 0:02:03.829 ***** 46400 1727204633.54516: entering _queue_task() for managed-node2/debug 46400 1727204633.54763: worker is 1 (out of 1 available) 46400 1727204633.54780: exiting _queue_task() for managed-node2/debug 46400 1727204633.54793: done queuing things up, now waiting for results queue to drain 46400 1727204633.54795: waiting for pending results... 46400 1727204633.55000: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204633.55114: in run() - task 0affcd87-79f5-1303-fda8-0000000024b6 46400 1727204633.55126: variable 'ansible_search_path' from source: unknown 46400 1727204633.55130: variable 'ansible_search_path' from source: unknown 46400 1727204633.55161: calling self._execute() 46400 1727204633.55237: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.55243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.55251: variable 'omit' from source: magic vars 46400 1727204633.55537: variable 'ansible_distribution_major_version' from source: facts 46400 1727204633.55547: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204633.55553: variable 'omit' from source: magic vars 46400 1727204633.55601: variable 'omit' from source: magic vars 46400 1727204633.55626: variable 'omit' from source: magic vars 46400 1727204633.55667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204633.55697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204633.55716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204633.55729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.55739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.55765: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204633.55770: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.55773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.55844: Set connection var ansible_shell_type to sh 46400 1727204633.55852: Set connection var ansible_shell_executable to /bin/sh 46400 1727204633.55857: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204633.55866: Set connection var ansible_connection to ssh 46400 1727204633.55872: Set connection var ansible_pipelining to False 46400 1727204633.55877: Set connection var ansible_timeout to 10 46400 1727204633.55896: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.55899: variable 'ansible_connection' from source: unknown 46400 1727204633.55903: variable 'ansible_module_compression' from source: unknown 46400 1727204633.55905: variable 'ansible_shell_type' from source: unknown 46400 1727204633.55908: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.55910: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.55912: variable 'ansible_pipelining' from source: unknown 46400 1727204633.55914: variable 'ansible_timeout' from source: unknown 46400 1727204633.55917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.56022: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204633.56036: variable 'omit' from source: magic vars 46400 1727204633.56042: starting attempt loop 46400 1727204633.56044: running the handler 46400 1727204633.56137: variable '__network_connections_result' from source: set_fact 46400 1727204633.56183: handler run complete 46400 1727204633.56196: attempt loop complete, returning result 46400 1727204633.56199: _execute() done 46400 1727204633.56201: dumping result to json 46400 1727204633.56203: done dumping result, returning 46400 1727204633.56211: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-0000000024b6] 46400 1727204633.56216: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b6 46400 1727204633.56301: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b6 46400 1727204633.56304: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 46400 1727204633.56387: no more pending results, returning what we have 46400 1727204633.56391: results queue empty 46400 1727204633.56393: checking for any_errors_fatal 46400 1727204633.56404: done checking for any_errors_fatal 46400 1727204633.56405: checking for max_fail_percentage 46400 1727204633.56407: done checking for max_fail_percentage 46400 1727204633.56408: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.56409: done checking to see if all hosts have failed 46400 1727204633.56409: getting the remaining hosts for this loop 46400 1727204633.56411: done getting the remaining hosts for this loop 46400 1727204633.56424: getting the next task for host managed-node2 46400 1727204633.56432: done getting next task for host managed-node2 46400 
1727204633.56436: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204633.56442: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204633.56456: getting variables 46400 1727204633.56457: in VariableManager get_vars() 46400 1727204633.56499: Calling all_inventory to load vars for managed-node2 46400 1727204633.56502: Calling groups_inventory to load vars for managed-node2 46400 1727204633.56504: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.56513: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.56515: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.56518: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.57385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204633.58328: done with get_vars() 46400 1727204633.58346: done getting variables 46400 1727204633.58396: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.039) 0:02:03.868 ***** 46400 1727204633.58426: entering _queue_task() for managed-node2/debug 46400 1727204633.58678: worker is 1 (out of 1 available) 46400 1727204633.58692: exiting _queue_task() for managed-node2/debug 46400 1727204633.58705: done queuing things up, now waiting for results queue to drain 46400 1727204633.58706: waiting for pending results... 
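Note: the "Show stderr messages for the network_connections" task that just completed, and the "Show debug messages for the network_connections" task now being queued, are plain debug tasks over the registered __network_connections_result fact. A minimal sketch of that pattern, assuming task names exactly as they appear in the log (the bodies in the role's main.yml may differ):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result

The log shows the same role-wide guard, ansible_distribution_major_version != '6', being evaluated (True) before each of these tasks runs.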
46400 1727204633.58911: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204633.59000: in run() - task 0affcd87-79f5-1303-fda8-0000000024b7 46400 1727204633.59013: variable 'ansible_search_path' from source: unknown 46400 1727204633.59016: variable 'ansible_search_path' from source: unknown 46400 1727204633.59051: calling self._execute() 46400 1727204633.59137: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.59142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.59146: variable 'omit' from source: magic vars 46400 1727204633.59434: variable 'ansible_distribution_major_version' from source: facts 46400 1727204633.59444: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204633.59449: variable 'omit' from source: magic vars 46400 1727204633.59505: variable 'omit' from source: magic vars 46400 1727204633.59528: variable 'omit' from source: magic vars 46400 1727204633.59564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204633.59597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204633.59614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204633.59627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.59636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.59661: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204633.59667: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.59671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.59740: Set connection var ansible_shell_type to sh 46400 1727204633.59748: Set connection var ansible_shell_executable to /bin/sh 46400 1727204633.59753: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204633.59758: Set connection var ansible_connection to ssh 46400 1727204633.59767: Set connection var ansible_pipelining to False 46400 1727204633.59772: Set connection var ansible_timeout to 10 46400 1727204633.59792: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.59796: variable 'ansible_connection' from source: unknown 46400 1727204633.59799: variable 'ansible_module_compression' from source: unknown 46400 1727204633.59802: variable 'ansible_shell_type' from source: unknown 46400 1727204633.59805: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.59807: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.59809: variable 'ansible_pipelining' from source: unknown 46400 1727204633.59811: variable 'ansible_timeout' from source: unknown 46400 1727204633.59813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.59919: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204633.59928: variable 'omit' from source: magic vars 46400 1727204633.59939: starting attempt loop 46400 1727204633.59942: running the handler 46400 1727204633.59981: variable '__network_connections_result' from source: set_fact 46400 1727204633.60045: variable '__network_connections_result' from source: set_fact 46400 1727204633.60126: handler run complete 46400 1727204633.60147: attempt loop complete, returning result 46400 1727204633.60150: _execute() done 46400 1727204633.60152: dumping result to json 46400 1727204633.60157: done dumping result, returning 46400 1727204633.60169: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-0000000024b7] 46400 1727204633.60175: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b7 46400 1727204633.60267: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b7 46400 1727204633.60270: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 46400 1727204633.60363: no more pending results, returning what we have 46400 1727204633.60369: results queue empty 46400 1727204633.60370: checking for any_errors_fatal 46400 1727204633.60378: done checking for any_errors_fatal 46400 1727204633.60378: checking for max_fail_percentage 46400 1727204633.60380: done checking for max_fail_percentage 46400 1727204633.60381: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.60382: done checking to see if all hosts have failed 46400 1727204633.60382: getting the remaining hosts for this loop 46400 1727204633.60384: done getting the remaining hosts for this loop 46400 1727204633.60388: getting the next task for host managed-node2 46400 1727204633.60396: done getting next task for host managed-node2 46400 1727204633.60400: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204633.60405: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204633.60417: getting variables 46400 1727204633.60419: in VariableManager get_vars() 46400 1727204633.60459: Calling all_inventory to load vars for managed-node2 46400 1727204633.60462: Calling groups_inventory to load vars for managed-node2 46400 1727204633.60468: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.60478: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.60481: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.60490: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.61428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204633.62360: done with get_vars() 46400 1727204633.62380: done getting variables 46400 1727204633.62427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.040) 0:02:03.909 ***** 46400 1727204633.62453: entering _queue_task() for managed-node2/debug 46400 1727204633.62702: worker is 1 (out of 1 available) 46400 1727204633.62716: exiting _queue_task() for managed-node2/debug 46400 1727204633.62729: done queuing things up, now waiting for results queue to drain 46400 1727204633.62731: waiting for pending results... 46400 1727204633.62932: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204633.63029: in run() - task 0affcd87-79f5-1303-fda8-0000000024b8 46400 1727204633.63041: variable 'ansible_search_path' from source: unknown 46400 1727204633.63044: variable 'ansible_search_path' from source: unknown 46400 1727204633.63080: calling self._execute() 46400 1727204633.63158: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.63161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.63176: variable 'omit' from source: magic vars 46400 1727204633.63452: variable 'ansible_distribution_major_version' from source: facts 46400 1727204633.63462: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204633.63549: variable 'network_state' from source: role '' defaults 46400 1727204633.63559: Evaluated conditional (network_state != {}): False 46400 1727204633.63562: when evaluation is False, skipping this task 46400 1727204633.63568: _execute() done 46400 1727204633.63571: dumping result to json 46400 1727204633.63573: done dumping result, returning 46400 1727204633.63580: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-0000000024b8] 46400 1727204633.63585: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b8 46400 1727204633.63683: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b8 46400 1727204633.63686: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204633.63731: no more pending results, returning what we 
have 46400 1727204633.63736: results queue empty 46400 1727204633.63737: checking for any_errors_fatal 46400 1727204633.63748: done checking for any_errors_fatal 46400 1727204633.63749: checking for max_fail_percentage 46400 1727204633.63751: done checking for max_fail_percentage 46400 1727204633.63752: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.63752: done checking to see if all hosts have failed 46400 1727204633.63753: getting the remaining hosts for this loop 46400 1727204633.63755: done getting the remaining hosts for this loop 46400 1727204633.63758: getting the next task for host managed-node2 46400 1727204633.63772: done getting next task for host managed-node2 46400 1727204633.63776: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204633.63781: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204633.63810: getting variables 46400 1727204633.63812: in VariableManager get_vars() 46400 1727204633.63852: Calling all_inventory to load vars for managed-node2 46400 1727204633.63855: Calling groups_inventory to load vars for managed-node2 46400 1727204633.63857: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.63872: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.63874: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.63877: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.64705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204633.65736: done with get_vars() 46400 1727204633.65752: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.033) 0:02:03.942 ***** 46400 1727204633.65829: entering _queue_task() for managed-node2/ping 46400 1727204633.66071: worker is 1 (out of 1 available) 46400 1727204633.66084: exiting _queue_task() for managed-node2/ping 46400 1727204633.66097: done queuing things up, now waiting for results queue to drain 46400 1727204633.66099: waiting for pending results... 
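Note: both "Configure networking state" and "Show debug messages for the network_state" above were skipped because network_state is empty (the conditional network_state != {} evaluated to False), while the "Re-test connectivity" task now starting runs the ping module against the managed host. A minimal sketch of those guards; the debug body is an illustrative assumption, since the skipped task never produced output:

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: network_state   # assumed variable; the task was skipped, so its body is not visible in this log
      when: network_state != {}

    - name: Re-test connectivity
      ansible.builtin.ping:
      when: ansible_distribution_major_version != '6'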
46400 1727204633.66295: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204633.66380: in run() - task 0affcd87-79f5-1303-fda8-0000000024b9 46400 1727204633.66392: variable 'ansible_search_path' from source: unknown 46400 1727204633.66395: variable 'ansible_search_path' from source: unknown 46400 1727204633.66423: calling self._execute() 46400 1727204633.66505: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.66510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.66517: variable 'omit' from source: magic vars 46400 1727204633.66809: variable 'ansible_distribution_major_version' from source: facts 46400 1727204633.66818: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204633.66824: variable 'omit' from source: magic vars 46400 1727204633.66875: variable 'omit' from source: magic vars 46400 1727204633.66901: variable 'omit' from source: magic vars 46400 1727204633.66934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204633.66960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204633.66983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204633.66999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.67010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204633.67033: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204633.67036: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.67040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.67112: Set connection var ansible_shell_type to sh 46400 1727204633.67120: Set connection var ansible_shell_executable to /bin/sh 46400 1727204633.67125: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204633.67130: Set connection var ansible_connection to ssh 46400 1727204633.67135: Set connection var ansible_pipelining to False 46400 1727204633.67140: Set connection var ansible_timeout to 10 46400 1727204633.67158: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.67161: variable 'ansible_connection' from source: unknown 46400 1727204633.67166: variable 'ansible_module_compression' from source: unknown 46400 1727204633.67169: variable 'ansible_shell_type' from source: unknown 46400 1727204633.67171: variable 'ansible_shell_executable' from source: unknown 46400 1727204633.67175: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204633.67180: variable 'ansible_pipelining' from source: unknown 46400 1727204633.67182: variable 'ansible_timeout' from source: unknown 46400 1727204633.67186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204633.67342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204633.67351: variable 'omit' from source: magic vars 46400 
1727204633.67354: starting attempt loop 46400 1727204633.67357: running the handler 46400 1727204633.67373: _low_level_execute_command(): starting 46400 1727204633.67379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204633.67923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.67939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.67955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204633.67983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.68021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.68033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.68096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.69750: stdout chunk (state=3): >>>/root <<< 46400 1727204633.69849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.69911: stderr chunk (state=3): >>><<< 46400 1727204633.69915: stdout chunk (state=3): >>><<< 46400 1727204633.69935: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.69948: _low_level_execute_command(): starting 46400 1727204633.69954: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542 `" && echo ansible-tmp-1727204633.6993616-55077-277116985107542="` 
echo /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542 `" ) && sleep 0' 46400 1727204633.70425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.70429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.70474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.70478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.70494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.70534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.70537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.70545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.70588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.72469: stdout chunk (state=3): >>>ansible-tmp-1727204633.6993616-55077-277116985107542=/root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542 <<< 46400 1727204633.72579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.72639: stderr chunk (state=3): >>><<< 46400 1727204633.72642: stdout chunk (state=3): >>><<< 46400 1727204633.72659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204633.6993616-55077-277116985107542=/root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.72704: variable 'ansible_module_compression' from source: unknown 46400 1727204633.72740: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204633.72769: variable 'ansible_facts' from source: unknown 46400 1727204633.72824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/AnsiballZ_ping.py 46400 1727204633.72936: Sending initial data 46400 1727204633.72946: Sent initial data (153 bytes) 46400 1727204633.73650: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.73654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.73692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.73695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.73697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.73755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.73758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.73768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.73803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.75523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204633.75557: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204633.75598: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpozhbss9b /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/AnsiballZ_ping.py <<< 46400 1727204633.75634: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204633.76386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.76506: stderr chunk (state=3): >>><<< 46400 1727204633.76511: stdout chunk (state=3): >>><<< 46400 1727204633.76530: done transferring module to remote 46400 1727204633.76538: _low_level_execute_command(): starting 46400 1727204633.76543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/ /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/AnsiballZ_ping.py && sleep 0' 46400 1727204633.77025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.77028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.77078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.77082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.77084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204633.77086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.77130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.77133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.77178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.79027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.79031: stdout chunk (state=3): >>><<< 46400 1727204633.79033: stderr chunk (state=3): >>><<< 46400 1727204633.79133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.79137: _low_level_execute_command(): starting 46400 1727204633.79139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/AnsiballZ_ping.py && sleep 0' 46400 1727204633.79909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 
1727204633.79925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.79949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.79969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.80013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.80031: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204633.80052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.80077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204633.80091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204633.80104: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204633.80116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.80137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.80159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.80187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204633.80201: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204633.80215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.80307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204633.80330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.80350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.80440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.93459: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204633.94488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204633.94593: stderr chunk (state=3): >>><<< 46400 1727204633.94596: stdout chunk (state=3): >>><<< 46400 1727204633.94736: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204633.94740: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204633.94743: _low_level_execute_command(): starting 46400 1727204633.94745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204633.6993616-55077-277116985107542/ > /dev/null 2>&1 && sleep 0' 46400 1727204633.95428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204633.95442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.95455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204633.95738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204633.95770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204633.95783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204633.95786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204633.95855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204633.95875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204633.95947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204633.97876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204633.97884: stdout chunk (state=3): >>><<< 46400 1727204633.97887: stderr chunk (state=3): >>><<< 46400 1727204633.97970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204633.97974: handler run complete 46400 1727204633.97977: attempt loop complete, returning result 46400 1727204633.97979: _execute() done 46400 1727204633.97981: dumping result to json 46400 1727204633.97984: done dumping result, returning 46400 1727204633.97986: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-0000000024b9] 46400 1727204633.97988: sending task result for task 0affcd87-79f5-1303-fda8-0000000024b9 46400 1727204633.98239: done sending task result for task 0affcd87-79f5-1303-fda8-0000000024b9 46400 1727204633.98243: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204633.98327: no more pending results, returning what we have 46400 1727204633.98331: results queue empty 46400 1727204633.98332: checking for any_errors_fatal 46400 1727204633.98337: done checking for any_errors_fatal 46400 1727204633.98338: checking for max_fail_percentage 46400 1727204633.98339: done checking for max_fail_percentage 46400 1727204633.98340: checking to see if all hosts have failed and the running result is not ok 46400 1727204633.98341: done checking to see if all hosts have failed 46400 1727204633.98342: getting the remaining hosts for this loop 46400 1727204633.98343: done getting the remaining hosts for this loop 46400 1727204633.98347: getting the next task for host managed-node2 46400 1727204633.98358: done getting next task for host managed-node2 46400 1727204633.98360: ^ task is: TASK: meta (role_complete) 46400 1727204633.98366: ^ state is: HOST STATE: block=8, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204633.98381: getting variables 46400 1727204633.98382: in VariableManager get_vars() 46400 1727204633.98424: Calling all_inventory to load vars for managed-node2 46400 1727204633.98427: Calling groups_inventory to load vars for managed-node2 46400 1727204633.98429: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204633.98439: Calling all_plugins_play to load vars for managed-node2 46400 1727204633.98442: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204633.98446: Calling groups_plugins_play to load vars for managed-node2 46400 1727204633.99909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.01709: done with get_vars() 46400 1727204634.01749: done getting variables 46400 1727204634.01846: done queuing things up, now waiting for results queue to drain 46400 1727204634.01848: results queue empty 46400 1727204634.01849: checking for any_errors_fatal 46400 1727204634.01852: done checking for any_errors_fatal 46400 1727204634.01853: checking for max_fail_percentage 46400 1727204634.01854: done checking for max_fail_percentage 46400 1727204634.01855: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.01856: done checking to see if all hosts have failed 46400 1727204634.01857: getting the remaining hosts for this loop 46400 1727204634.01857: done getting the remaining hosts for this loop 46400 1727204634.01860: getting the next task for host managed-node2 46400 1727204634.01868: done getting next task for host managed-node2 46400 1727204634.01871: ^ task is: TASK: Test 46400 1727204634.01873: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204634.01877: getting variables 46400 1727204634.01878: in VariableManager get_vars() 46400 1727204634.01892: Calling all_inventory to load vars for managed-node2 46400 1727204634.01895: Calling groups_inventory to load vars for managed-node2 46400 1727204634.01897: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.01902: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.01905: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.01907: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.08938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.10168: done with get_vars() 46400 1727204634.10190: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.444) 0:02:04.387 ***** 46400 1727204634.10246: entering _queue_task() for managed-node2/include_tasks 46400 1727204634.10533: worker is 1 (out of 1 available) 46400 1727204634.10548: exiting _queue_task() for managed-node2/include_tasks 46400 1727204634.10561: done queuing things up, now waiting for results queue to drain 46400 1727204634.10563: waiting for pending results... 46400 1727204634.10769: running TaskExecutor() for managed-node2/TASK: Test 46400 1727204634.10861: in run() - task 0affcd87-79f5-1303-fda8-0000000020b1 46400 1727204634.10881: variable 'ansible_search_path' from source: unknown 46400 1727204634.10887: variable 'ansible_search_path' from source: unknown 46400 1727204634.10921: variable 'lsr_test' from source: include params 46400 1727204634.11095: variable 'lsr_test' from source: include params 46400 1727204634.11150: variable 'omit' from source: magic vars 46400 1727204634.11269: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.11282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.11290: variable 'omit' from source: magic vars 46400 1727204634.11468: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.11478: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.11484: variable 'item' from source: unknown 46400 1727204634.11531: variable 'item' from source: unknown 46400 1727204634.11562: variable 'item' from source: unknown 46400 1727204634.11608: variable 'item' from source: unknown 46400 1727204634.11747: dumping result to json 46400 1727204634.11750: done dumping result, returning 46400 1727204634.11752: done running TaskExecutor() for managed-node2/TASK: Test [0affcd87-79f5-1303-fda8-0000000020b1] 46400 1727204634.11754: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b1 46400 1727204634.11798: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b1 46400 1727204634.11801: WORKER PROCESS EXITING 46400 1727204634.11828: no more pending results, returning what we have 46400 1727204634.11833: in VariableManager get_vars() 46400 1727204634.11881: Calling all_inventory to load vars for managed-node2 46400 1727204634.11883: Calling groups_inventory to load vars for managed-node2 46400 1727204634.11887: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.11899: Calling all_plugins_play to load vars for managed-node2 
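
The "Re-test connectivity" task logged above is the role's final ping check, and the chunks show the full remote execution lifecycle over the reused multiplexed SSH connection (mux_client_request_session): create a temp directory under /root/.ansible/tmp, copy AnsiballZ_ping.py to it over sftp, chmod it, run it with /usr/bin/python3.9, read back {"ping": "pong"}, and remove the temp directory with rm -f -r. A minimal sketch of the task itself follows; the task name and the ping module are taken from the log, everything else is an assumption rather than the role's actual file:

# --- hypothetical sketch, not copied from the role ---
- name: Re-test connectivity
  ansible.builtin.ping:

All five _low_level_execute_command() steps above are visible verbatim in the stdout/stderr chunks, which is useful when auditing what a task actually did on the managed node.
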
46400 1727204634.11902: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.11904: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.13121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.14786: done with get_vars() 46400 1727204634.14811: variable 'ansible_search_path' from source: unknown 46400 1727204634.14813: variable 'ansible_search_path' from source: unknown 46400 1727204634.14858: we have included files to process 46400 1727204634.14859: generating all_blocks data 46400 1727204634.14861: done generating all_blocks data 46400 1727204634.14869: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204634.14871: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204634.14873: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 46400 1727204634.15005: done processing included file 46400 1727204634.15007: iterating over new_blocks loaded from include file 46400 1727204634.15008: in VariableManager get_vars() 46400 1727204634.15028: done with get_vars() 46400 1727204634.15030: filtering new block on tags 46400 1727204634.15057: done filtering new block on tags 46400 1727204634.15059: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 46400 1727204634.15066: extending task lists for all hosts with included blocks 46400 1727204634.16046: done extending task lists 46400 1727204634.16047: done processing included files 46400 1727204634.16048: results queue empty 46400 1727204634.16049: checking for any_errors_fatal 46400 1727204634.16051: done checking for any_errors_fatal 46400 1727204634.16052: checking for max_fail_percentage 46400 1727204634.16053: done checking for max_fail_percentage 46400 1727204634.16054: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.16055: done checking to see if all hosts have failed 46400 1727204634.16055: getting the remaining hosts for this loop 46400 1727204634.16056: done getting the remaining hosts for this loop 46400 1727204634.16059: getting the next task for host managed-node2 46400 1727204634.16063: done getting next task for host managed-node2 46400 1727204634.16066: ^ task is: TASK: Include network role 46400 1727204634.16070: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 46400 1727204634.16072: getting variables 46400 1727204634.16073: in VariableManager get_vars() 46400 1727204634.16087: Calling all_inventory to load vars for managed-node2 46400 1727204634.16089: Calling groups_inventory to load vars for managed-node2 46400 1727204634.16091: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.16098: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.16100: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.16103: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.17411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.19033: done with get_vars() 46400 1727204634.19058: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.089) 0:02:04.476 ***** 46400 1727204634.19152: entering _queue_task() for managed-node2/include_role 46400 1727204634.19511: worker is 1 (out of 1 available) 46400 1727204634.19525: exiting _queue_task() for managed-node2/include_role 46400 1727204634.19539: done queuing things up, now waiting for results queue to drain 46400 1727204634.19541: waiting for pending results... 46400 1727204634.19852: running TaskExecutor() for managed-node2/TASK: Include network role 46400 1727204634.19983: in run() - task 0affcd87-79f5-1303-fda8-000000002612 46400 1727204634.20010: variable 'ansible_search_path' from source: unknown 46400 1727204634.20014: variable 'ansible_search_path' from source: unknown 46400 1727204634.20052: calling self._execute() 46400 1727204634.20159: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.20169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.20180: variable 'omit' from source: magic vars 46400 1727204634.20646: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.20662: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.20673: _execute() done 46400 1727204634.20676: dumping result to json 46400 1727204634.20679: done dumping result, returning 46400 1727204634.20685: done running TaskExecutor() for managed-node2/TASK: Include network role [0affcd87-79f5-1303-fda8-000000002612] 46400 1727204634.20692: sending task result for task 0affcd87-79f5-1303-fda8-000000002612 46400 1727204634.20808: done sending task result for task 0affcd87-79f5-1303-fda8-000000002612 46400 1727204634.20811: WORKER PROCESS EXITING 46400 1727204634.20836: no more pending results, returning what we have 46400 1727204634.20842: in VariableManager get_vars() 46400 1727204634.20897: Calling all_inventory to load vars for managed-node2 46400 1727204634.20900: Calling groups_inventory to load vars for managed-node2 46400 1727204634.20904: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.20917: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.20920: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.20922: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.22875: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.24761: done with get_vars() 46400 1727204634.24784: variable 'ansible_search_path' from source: unknown 46400 1727204634.24785: variable 'ansible_search_path' from source: unknown 46400 1727204634.24929: variable 'omit' from source: magic vars 46400 1727204634.24973: variable 'omit' from source: magic vars 46400 1727204634.24990: variable 'omit' from source: magic vars 46400 1727204634.24994: we have included files to process 46400 1727204634.24995: generating all_blocks data 46400 1727204634.24997: done generating all_blocks data 46400 1727204634.24998: processing included file: fedora.linux_system_roles.network 46400 1727204634.25020: in VariableManager get_vars() 46400 1727204634.25038: done with get_vars() 46400 1727204634.25068: in VariableManager get_vars() 46400 1727204634.25089: done with get_vars() 46400 1727204634.25126: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 46400 1727204634.25250: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 46400 1727204634.25333: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 46400 1727204634.25857: in VariableManager get_vars() 46400 1727204634.25881: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204634.28500: iterating over new_blocks loaded from include file 46400 1727204634.28503: in VariableManager get_vars() 46400 1727204634.28525: done with get_vars() 46400 1727204634.28527: filtering new block on tags 46400 1727204634.28821: done filtering new block on tags 46400 1727204634.28825: in VariableManager get_vars() 46400 1727204634.28846: done with get_vars() 46400 1727204634.28848: filtering new block on tags 46400 1727204634.28866: done filtering new block on tags 46400 1727204634.28868: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 46400 1727204634.28874: extending task lists for all hosts with included blocks 46400 1727204634.28985: done extending task lists 46400 1727204634.28987: done processing included files 46400 1727204634.28987: results queue empty 46400 1727204634.28988: checking for any_errors_fatal 46400 1727204634.28992: done checking for any_errors_fatal 46400 1727204634.28993: checking for max_fail_percentage 46400 1727204634.28994: done checking for max_fail_percentage 46400 1727204634.28994: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.28995: done checking to see if all hosts have failed 46400 1727204634.28996: getting the remaining hosts for this loop 46400 1727204634.28997: done getting the remaining hosts for this loop 46400 1727204634.28999: getting the next task for host managed-node2 46400 1727204634.29004: done getting next task for host managed-node2 46400 1727204634.29006: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204634.29009: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204634.29021: getting variables 46400 1727204634.29022: in VariableManager get_vars() 46400 1727204634.29037: Calling all_inventory to load vars for managed-node2 46400 1727204634.29039: Calling groups_inventory to load vars for managed-node2 46400 1727204634.29041: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.29047: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.29049: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.29051: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.30905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.32634: done with get_vars() 46400 1727204634.32669: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.135) 0:02:04.612 ***** 46400 1727204634.32755: entering _queue_task() for managed-node2/include_tasks 46400 1727204634.33120: worker is 1 (out of 1 available) 46400 1727204634.33134: exiting _queue_task() for managed-node2/include_tasks 46400 1727204634.33145: done queuing things up, now waiting for results queue to drain 46400 1727204634.33147: waiting for pending results... 
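
The entries above trace the include chain driving this phase of the test run: the "Test" task at run_test.yml:30 loops over lsr_test and includes tasks/remove+down_profile.yml, whose "Include network role" task at line 3 pulls in fedora.linux_system_roles.network (loading the role's defaults/main.yml, meta/main.yml and tasks/main.yml, and redirecting ansible.builtin.yum to ansible.builtin.dnf along the way). A plausible reconstruction of the two include tasks is sketched below; only the task names, actions, and the lsr_test/item variables are taken from the log, the exact file contents are assumed:

# --- hypothetical sketch of the Test task at run_test.yml:30 ---
- name: Test
  include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"

# --- hypothetical sketch of tasks/remove+down_profile.yml:3 ---
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network

Each included file then extends the per-host task list, which is why the log reports "extending task lists for all hosts with included blocks" before the role's own tasks start running.
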
46400 1727204634.33594: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 46400 1727204634.33737: in run() - task 0affcd87-79f5-1303-fda8-000000002694 46400 1727204634.33762: variable 'ansible_search_path' from source: unknown 46400 1727204634.33773: variable 'ansible_search_path' from source: unknown 46400 1727204634.33814: calling self._execute() 46400 1727204634.33939: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.33950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.33969: variable 'omit' from source: magic vars 46400 1727204634.34351: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.34371: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.34383: _execute() done 46400 1727204634.34390: dumping result to json 46400 1727204634.34401: done dumping result, returning 46400 1727204634.34411: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1303-fda8-000000002694] 46400 1727204634.34422: sending task result for task 0affcd87-79f5-1303-fda8-000000002694 46400 1727204634.34584: no more pending results, returning what we have 46400 1727204634.34590: in VariableManager get_vars() 46400 1727204634.34650: Calling all_inventory to load vars for managed-node2 46400 1727204634.34656: Calling groups_inventory to load vars for managed-node2 46400 1727204634.34659: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.34674: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.34677: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.34680: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.35789: done sending task result for task 0affcd87-79f5-1303-fda8-000000002694 46400 1727204634.35792: WORKER PROCESS EXITING 46400 1727204634.36631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.38710: done with get_vars() 46400 1727204634.38745: variable 'ansible_search_path' from source: unknown 46400 1727204634.38747: variable 'ansible_search_path' from source: unknown 46400 1727204634.38795: we have included files to process 46400 1727204634.38796: generating all_blocks data 46400 1727204634.38798: done generating all_blocks data 46400 1727204634.38802: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204634.38803: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204634.38805: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 46400 1727204634.39448: done processing included file 46400 1727204634.39450: iterating over new_blocks loaded from include file 46400 1727204634.39455: in VariableManager get_vars() 46400 1727204634.39487: done with get_vars() 46400 1727204634.39489: filtering new block on tags 46400 1727204634.39520: done filtering new block on tags 46400 1727204634.39522: in VariableManager get_vars() 46400 1727204634.39539: done with get_vars() 46400 1727204634.39540: filtering new block on tags 46400 1727204634.39572: done filtering new block on tags 46400 1727204634.39574: in 
VariableManager get_vars() 46400 1727204634.39590: done with get_vars() 46400 1727204634.39591: filtering new block on tags 46400 1727204634.39624: done filtering new block on tags 46400 1727204634.39626: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 46400 1727204634.39630: extending task lists for all hosts with included blocks 46400 1727204634.41196: done extending task lists 46400 1727204634.41197: done processing included files 46400 1727204634.41198: results queue empty 46400 1727204634.41199: checking for any_errors_fatal 46400 1727204634.41202: done checking for any_errors_fatal 46400 1727204634.41203: checking for max_fail_percentage 46400 1727204634.41204: done checking for max_fail_percentage 46400 1727204634.41205: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.41206: done checking to see if all hosts have failed 46400 1727204634.41206: getting the remaining hosts for this loop 46400 1727204634.41208: done getting the remaining hosts for this loop 46400 1727204634.41210: getting the next task for host managed-node2 46400 1727204634.41215: done getting next task for host managed-node2 46400 1727204634.41217: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204634.41222: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204634.41234: getting variables 46400 1727204634.41235: in VariableManager get_vars() 46400 1727204634.41254: Calling all_inventory to load vars for managed-node2 46400 1727204634.41257: Calling groups_inventory to load vars for managed-node2 46400 1727204634.41259: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.41266: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.41269: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.41272: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.43126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.44699: done with get_vars() 46400 1727204634.44719: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.120) 0:02:04.732 ***** 46400 1727204634.44784: entering _queue_task() for managed-node2/setup 46400 1727204634.45048: worker is 1 (out of 1 available) 46400 1727204634.45061: exiting _queue_task() for managed-node2/setup 46400 1727204634.45076: done queuing things up, now waiting for results queue to drain 46400 1727204634.45077: waiting for pending results... 46400 1727204634.45386: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 46400 1727204634.45508: in run() - task 0affcd87-79f5-1303-fda8-0000000026eb 46400 1727204634.45529: variable 'ansible_search_path' from source: unknown 46400 1727204634.45534: variable 'ansible_search_path' from source: unknown 46400 1727204634.45570: calling self._execute() 46400 1727204634.45675: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.45681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.45691: variable 'omit' from source: magic vars 46400 1727204634.46071: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.46085: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.46378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204634.48519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204634.48572: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204634.48602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204634.48628: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204634.48650: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204634.48717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204634.48736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 46400 1727204634.48756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204634.48789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204634.48801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204634.48841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204634.48859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204634.48881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204634.48906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204634.48918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204634.49041: variable '__network_required_facts' from source: role '' defaults 46400 1727204634.49050: variable 'ansible_facts' from source: unknown 46400 1727204634.49553: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 46400 1727204634.49557: when evaluation is False, skipping this task 46400 1727204634.49560: _execute() done 46400 1727204634.49564: dumping result to json 46400 1727204634.49566: done dumping result, returning 46400 1727204634.49579: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1303-fda8-0000000026eb] 46400 1727204634.49582: sending task result for task 0affcd87-79f5-1303-fda8-0000000026eb 46400 1727204634.49672: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026eb 46400 1727204634.49676: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204634.49727: no more pending results, returning what we have 46400 1727204634.49731: results queue empty 46400 1727204634.49732: checking for any_errors_fatal 46400 1727204634.49734: done checking for any_errors_fatal 46400 1727204634.49734: checking for max_fail_percentage 46400 1727204634.49736: done checking for max_fail_percentage 46400 1727204634.49737: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.49738: done checking to see if all hosts have failed 46400 1727204634.49738: getting the remaining hosts for this loop 46400 1727204634.49740: done getting the remaining hosts for 
this loop 46400 1727204634.49744: getting the next task for host managed-node2 46400 1727204634.49756: done getting next task for host managed-node2 46400 1727204634.49761: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204634.49769: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204634.49802: getting variables 46400 1727204634.49804: in VariableManager get_vars() 46400 1727204634.49848: Calling all_inventory to load vars for managed-node2 46400 1727204634.49851: Calling groups_inventory to load vars for managed-node2 46400 1727204634.49853: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.49865: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.49867: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.49876: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.50766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.51720: done with get_vars() 46400 1727204634.51737: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.070) 0:02:04.802 ***** 46400 1727204634.51825: entering _queue_task() for managed-node2/stat 46400 1727204634.52074: worker is 1 (out of 1 available) 46400 1727204634.52088: exiting _queue_task() for managed-node2/stat 46400 1727204634.52103: done queuing things up, now waiting for results queue to drain 46400 1727204634.52104: waiting for pending results... 
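
The "Ensure ansible_facts used by role are present" task above was skipped because every fact the role needs was already gathered: the log shows the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False. The gating pattern looks roughly like the following sketch; only the task name and the when expression come from the log, the setup parameters are an assumption:

# --- hypothetical sketch of the fact-gathering guard in set_facts.yml ---
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed subset; not shown in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

The ostree check that follows uses the same guard-style idea: it only runs when __network_is_ostree is not already defined, which is why the stat task below is reported as skipped with false_condition "not __network_is_ostree is defined".
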
46400 1727204634.52307: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 46400 1727204634.52414: in run() - task 0affcd87-79f5-1303-fda8-0000000026ed 46400 1727204634.52426: variable 'ansible_search_path' from source: unknown 46400 1727204634.52429: variable 'ansible_search_path' from source: unknown 46400 1727204634.52461: calling self._execute() 46400 1727204634.52539: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.52544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.52553: variable 'omit' from source: magic vars 46400 1727204634.52835: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.52846: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.52967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204634.53173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204634.53207: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204634.53236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204634.53259: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204634.53327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204634.53348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204634.53370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204634.53388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204634.53461: variable '__network_is_ostree' from source: set_fact 46400 1727204634.53470: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204634.53473: when evaluation is False, skipping this task 46400 1727204634.53476: _execute() done 46400 1727204634.53479: dumping result to json 46400 1727204634.53481: done dumping result, returning 46400 1727204634.53487: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1303-fda8-0000000026ed] 46400 1727204634.53493: sending task result for task 0affcd87-79f5-1303-fda8-0000000026ed 46400 1727204634.53587: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026ed 46400 1727204634.53590: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204634.53643: no more pending results, returning what we have 46400 1727204634.53647: results queue empty 46400 1727204634.53648: checking for any_errors_fatal 46400 1727204634.53658: done checking for any_errors_fatal 46400 1727204634.53659: checking for 
max_fail_percentage 46400 1727204634.53660: done checking for max_fail_percentage 46400 1727204634.53661: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.53662: done checking to see if all hosts have failed 46400 1727204634.53663: getting the remaining hosts for this loop 46400 1727204634.53666: done getting the remaining hosts for this loop 46400 1727204634.53671: getting the next task for host managed-node2 46400 1727204634.53679: done getting next task for host managed-node2 46400 1727204634.53683: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204634.53688: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204634.53723: getting variables 46400 1727204634.53724: in VariableManager get_vars() 46400 1727204634.53768: Calling all_inventory to load vars for managed-node2 46400 1727204634.53771: Calling groups_inventory to load vars for managed-node2 46400 1727204634.53773: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.53782: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.53785: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.53787: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.54805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.55739: done with get_vars() 46400 1727204634.55761: done getting variables 46400 1727204634.55808: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.040) 0:02:04.842 ***** 46400 1727204634.55836: entering _queue_task() for managed-node2/set_fact 46400 1727204634.56093: worker is 1 (out of 1 available) 46400 1727204634.56108: exiting _queue_task() for managed-node2/set_fact 46400 1727204634.56123: done queuing things up, now waiting for results queue to drain 46400 1727204634.56124: waiting for pending results... 46400 1727204634.56321: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 46400 1727204634.56440: in run() - task 0affcd87-79f5-1303-fda8-0000000026ee 46400 1727204634.56454: variable 'ansible_search_path' from source: unknown 46400 1727204634.56457: variable 'ansible_search_path' from source: unknown 46400 1727204634.56490: calling self._execute() 46400 1727204634.56572: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.56577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.56586: variable 'omit' from source: magic vars 46400 1727204634.56873: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.56882: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.57004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204634.57210: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204634.57245: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204634.57274: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204634.57300: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204634.57369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204634.57387: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204634.57408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204634.57428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204634.57507: variable '__network_is_ostree' from source: set_fact 46400 1727204634.57511: Evaluated conditional (not __network_is_ostree is defined): False 46400 1727204634.57515: when evaluation is False, skipping this task 46400 1727204634.57517: _execute() done 46400 1727204634.57520: dumping result to json 46400 1727204634.57523: done dumping result, returning 46400 1727204634.57532: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1303-fda8-0000000026ee] 46400 1727204634.57534: sending task result for task 0affcd87-79f5-1303-fda8-0000000026ee 46400 1727204634.57619: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026ee 46400 1727204634.57622: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 46400 1727204634.57700: no more pending results, returning what we have 46400 1727204634.57704: results queue empty 46400 1727204634.57705: checking for any_errors_fatal 46400 1727204634.57712: done checking for any_errors_fatal 46400 1727204634.57713: checking for max_fail_percentage 46400 1727204634.57715: done checking for max_fail_percentage 46400 1727204634.57716: checking to see if all hosts have failed and the running result is not ok 46400 1727204634.57716: done checking to see if all hosts have failed 46400 1727204634.57717: getting the remaining hosts for this loop 46400 1727204634.57719: done getting the remaining hosts for this loop 46400 1727204634.57723: getting the next task for host managed-node2 46400 1727204634.57739: done getting next task for host managed-node2 46400 1727204634.57748: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204634.57754: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204634.57781: getting variables 46400 1727204634.57783: in VariableManager get_vars() 46400 1727204634.57823: Calling all_inventory to load vars for managed-node2 46400 1727204634.57825: Calling groups_inventory to load vars for managed-node2 46400 1727204634.57827: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204634.57836: Calling all_plugins_play to load vars for managed-node2 46400 1727204634.57839: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204634.57846: Calling groups_plugins_play to load vars for managed-node2 46400 1727204634.58700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204634.59660: done with get_vars() 46400 1727204634.59684: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:54 -0400 (0:00:00.039) 0:02:04.882 ***** 46400 1727204634.59760: entering _queue_task() for managed-node2/service_facts 46400 1727204634.60014: worker is 1 (out of 1 available) 46400 1727204634.60029: exiting _queue_task() for managed-node2/service_facts 46400 1727204634.60043: done queuing things up, now waiting for results queue to drain 46400 1727204634.60045: waiting for pending results... 
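[Editorial sketch for readers following the trace] Unlike the two skipped tasks above, "Check which services are running" (set_facts.yml:21) does execute: the trace below shows the service_facts module being packaged with AnsiballZ, copied over the multiplexed SSH connection into a temp directory under /root/.ansible/tmp, made executable, and run with /usr/bin/python3.9 on managed-node2, returning the large "services" fact dictionary. A minimal sketch of what that task plausibly looks like, with the conditional guard shown as True in the log; any names beyond the module itself are assumptions:

    # Hypothetical sketch of set_facts.yml:21 (not copied from the role file)
    - name: Check which services are running
      ansible.builtin.service_facts:
      # populates ansible_facts.services, as seen in the module stdout below

An equivalent ad-hoc check against the same host would be something like "ansible -i <inventory> managed-node2 -m ansible.builtin.service_facts", which produces the same services dictionary that appears in the stdout chunks further down.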
46400 1727204634.60251: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 46400 1727204634.60345: in run() - task 0affcd87-79f5-1303-fda8-0000000026f0 46400 1727204634.60356: variable 'ansible_search_path' from source: unknown 46400 1727204634.60360: variable 'ansible_search_path' from source: unknown 46400 1727204634.60392: calling self._execute() 46400 1727204634.60474: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.60479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.60489: variable 'omit' from source: magic vars 46400 1727204634.60772: variable 'ansible_distribution_major_version' from source: facts 46400 1727204634.60784: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204634.60789: variable 'omit' from source: magic vars 46400 1727204634.60848: variable 'omit' from source: magic vars 46400 1727204634.60877: variable 'omit' from source: magic vars 46400 1727204634.60913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204634.60943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204634.60961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204634.60978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204634.60989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204634.61013: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204634.61017: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.61019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.61089: Set connection var ansible_shell_type to sh 46400 1727204634.61097: Set connection var ansible_shell_executable to /bin/sh 46400 1727204634.61101: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204634.61107: Set connection var ansible_connection to ssh 46400 1727204634.61114: Set connection var ansible_pipelining to False 46400 1727204634.61116: Set connection var ansible_timeout to 10 46400 1727204634.61137: variable 'ansible_shell_executable' from source: unknown 46400 1727204634.61140: variable 'ansible_connection' from source: unknown 46400 1727204634.61144: variable 'ansible_module_compression' from source: unknown 46400 1727204634.61147: variable 'ansible_shell_type' from source: unknown 46400 1727204634.61149: variable 'ansible_shell_executable' from source: unknown 46400 1727204634.61151: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204634.61153: variable 'ansible_pipelining' from source: unknown 46400 1727204634.61155: variable 'ansible_timeout' from source: unknown 46400 1727204634.61157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204634.61311: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204634.61321: variable 'omit' from source: magic vars 46400 
1727204634.61324: starting attempt loop 46400 1727204634.61327: running the handler 46400 1727204634.61339: _low_level_execute_command(): starting 46400 1727204634.61346: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204634.61895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.61899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.61928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.61932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.61934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.61994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204634.61997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204634.62000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204634.62050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204634.63685: stdout chunk (state=3): >>>/root <<< 46400 1727204634.63791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204634.63852: stderr chunk (state=3): >>><<< 46400 1727204634.63856: stdout chunk (state=3): >>><<< 46400 1727204634.63880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204634.63893: _low_level_execute_command(): starting 46400 1727204634.63899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015 `" && echo ansible-tmp-1727204634.6388092-55116-111952026516015="` echo /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015 `" ) && sleep 0' 46400 1727204634.64382: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.64385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.64416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204634.64421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.64432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.64479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204634.64485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204634.64537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204634.66386: stdout chunk (state=3): >>>ansible-tmp-1727204634.6388092-55116-111952026516015=/root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015 <<< 46400 1727204634.66501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204634.66560: stderr chunk (state=3): >>><<< 46400 1727204634.66563: stdout chunk (state=3): >>><<< 46400 1727204634.66583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204634.6388092-55116-111952026516015=/root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204634.66624: variable 'ansible_module_compression' from 
source: unknown 46400 1727204634.66668: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 46400 1727204634.66699: variable 'ansible_facts' from source: unknown 46400 1727204634.66758: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/AnsiballZ_service_facts.py 46400 1727204634.66872: Sending initial data 46400 1727204634.66875: Sent initial data (162 bytes) 46400 1727204634.67573: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.67579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.67623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.67628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204634.67630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.67694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204634.67698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204634.67745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204634.69471: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204634.69499: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204634.69536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp2d358_6n /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/AnsiballZ_service_facts.py <<< 46400 1727204634.69576: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204634.70394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204634.70511: stderr chunk (state=3): >>><<< 46400 1727204634.70514: stdout chunk (state=3): >>><<< 46400 1727204634.70530: done transferring module to remote 46400 1727204634.70542: _low_level_execute_command(): starting 46400 1727204634.70546: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/ /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/AnsiballZ_service_facts.py && sleep 0' 46400 1727204634.71293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204634.71314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.71335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204634.71354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.71404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204634.71425: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204634.71447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.71473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204634.71488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204634.71502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204634.71521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.71545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204634.71569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.71583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204634.71593: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204634.71606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.71703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204634.71725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204634.71744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204634.71824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204634.73523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204634.73586: stderr chunk (state=3): >>><<< 46400 1727204634.73589: stdout chunk (state=3): >>><<< 46400 1727204634.73606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204634.73610: _low_level_execute_command(): starting 46400 1727204634.73614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/AnsiballZ_service_facts.py && sleep 0' 46400 1727204634.74051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204634.74057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204634.74103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.74106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204634.74108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204634.74174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204634.74178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204634.74232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.02878: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": 
{"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state<<< 46400 1727204636.02892: stdout chunk (state=3): >>>": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syste<<< 46400 1727204636.02895: stdout chunk (state=3): >>>md-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.se<<< 46400 1727204636.02900: stdout chunk (state=3): >>>rvice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": 
"systemd-hibern<<< 46400 1727204636.02957: stdout chunk (state=3): >>>ate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 46400 1727204636.04077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204636.04172: stderr chunk (state=3): >>><<< 46400 1727204636.04176: stdout chunk (state=3): >>><<< 46400 1727204636.04182: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204636.05507: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204636.05526: _low_level_execute_command(): starting 46400 1727204636.05537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204634.6388092-55116-111952026516015/ > /dev/null 2>&1 && sleep 0' 46400 1727204636.06261: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204636.06279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.06304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204636.06322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.06369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204636.06384: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204636.06405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.06424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204636.06436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204636.06450: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204636.06472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.06487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204636.06504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.06529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204636.06543: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204636.06559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.06647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.06665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204636.06682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.06750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.08517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.08565: stderr chunk (state=3): >>><<< 46400 1727204636.08570: stdout chunk (state=3): >>><<< 46400 1727204636.08587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204636.08592: handler run complete 46400 1727204636.08700: variable 'ansible_facts' from source: unknown 46400 1727204636.08793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.09049: variable 'ansible_facts' from source: unknown 46400 1727204636.09127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.09258: attempt loop complete, returning result 46400 1727204636.09267: _execute() done 46400 1727204636.09270: dumping result to json 46400 1727204636.09302: done dumping result, returning 46400 1727204636.09311: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1303-fda8-0000000026f0] 46400 1727204636.09316: sending task result for task 0affcd87-79f5-1303-fda8-0000000026f0 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204636.10886: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026f0 46400 1727204636.10889: WORKER PROCESS EXITING 46400 1727204636.10915: no more pending results, returning what we have 46400 1727204636.10918: results queue empty 46400 1727204636.10919: checking for any_errors_fatal 46400 1727204636.10922: done checking for any_errors_fatal 46400 1727204636.10922: checking for max_fail_percentage 46400 1727204636.10923: done checking for max_fail_percentage 46400 1727204636.10924: checking to see if all hosts have failed and the running result is not ok 46400 1727204636.10924: done checking to see if all hosts have failed 46400 1727204636.10925: getting the remaining hosts for this loop 46400 1727204636.10926: done getting the remaining hosts for this loop 46400 1727204636.10928: getting the next task for host managed-node2 46400 1727204636.10933: done getting next task for host managed-node2 46400 1727204636.10936: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204636.10941: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204636.10959: getting variables 46400 1727204636.10960: in VariableManager get_vars() 46400 1727204636.10993: Calling all_inventory to load vars for managed-node2 46400 1727204636.10995: Calling groups_inventory to load vars for managed-node2 46400 1727204636.10997: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204636.11004: Calling all_plugins_play to load vars for managed-node2 46400 1727204636.11006: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204636.11008: Calling groups_plugins_play to load vars for managed-node2 46400 1727204636.12356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.13651: done with get_vars() 46400 1727204636.13676: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:56 -0400 (0:00:01.539) 0:02:06.422 ***** 46400 1727204636.13750: entering _queue_task() for managed-node2/package_facts 46400 1727204636.14006: worker is 1 (out of 1 available) 46400 1727204636.14023: exiting _queue_task() for managed-node2/package_facts 46400 1727204636.14040: done queuing things up, now waiting for results queue to drain 46400 1727204636.14041: waiting for pending results... 
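(Editorial note: the task just queued is package_facts from tasks/set_facts.yml:26 of the fedora.linux_system_roles.network role, following the service_facts task whose result was censored by no_log above. The role file itself is not reproduced in this log; a minimal sketch of what the two fact-gathering tasks plausibly look like, assuming no_log: true on the service check to match the censored result, is:)

    # Plausible shape of the referenced tasks - an assumption, the actual
    # set_facts.yml content is not shown in this log.
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true   # consistent with the "censored" task result above

    - name: Check which packages are installed
      ansible.builtin.package_facts: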
46400 1727204636.14232: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 46400 1727204636.14349: in run() - task 0affcd87-79f5-1303-fda8-0000000026f1 46400 1727204636.14363: variable 'ansible_search_path' from source: unknown 46400 1727204636.14370: variable 'ansible_search_path' from source: unknown 46400 1727204636.14400: calling self._execute() 46400 1727204636.14474: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.14479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.14490: variable 'omit' from source: magic vars 46400 1727204636.14808: variable 'ansible_distribution_major_version' from source: facts 46400 1727204636.14826: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204636.14839: variable 'omit' from source: magic vars 46400 1727204636.14938: variable 'omit' from source: magic vars 46400 1727204636.14994: variable 'omit' from source: magic vars 46400 1727204636.15044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204636.15103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204636.15132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204636.15156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204636.15184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204636.15221: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204636.15229: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.15236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.15346: Set connection var ansible_shell_type to sh 46400 1727204636.15366: Set connection var ansible_shell_executable to /bin/sh 46400 1727204636.15377: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204636.15386: Set connection var ansible_connection to ssh 46400 1727204636.15401: Set connection var ansible_pipelining to False 46400 1727204636.15414: Set connection var ansible_timeout to 10 46400 1727204636.15441: variable 'ansible_shell_executable' from source: unknown 46400 1727204636.15449: variable 'ansible_connection' from source: unknown 46400 1727204636.15456: variable 'ansible_module_compression' from source: unknown 46400 1727204636.15467: variable 'ansible_shell_type' from source: unknown 46400 1727204636.15475: variable 'ansible_shell_executable' from source: unknown 46400 1727204636.15481: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.15488: variable 'ansible_pipelining' from source: unknown 46400 1727204636.15495: variable 'ansible_timeout' from source: unknown 46400 1727204636.15509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.15756: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204636.15778: variable 'omit' from source: magic vars 46400 
1727204636.15788: starting attempt loop 46400 1727204636.15794: running the handler 46400 1727204636.15811: _low_level_execute_command(): starting 46400 1727204636.15821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204636.16392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204636.16401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204636.16413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.16457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.16463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204636.16469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.16509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.16518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204636.16525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.16583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.18143: stdout chunk (state=3): >>>/root <<< 46400 1727204636.18245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.18304: stderr chunk (state=3): >>><<< 46400 1727204636.18307: stdout chunk (state=3): >>><<< 46400 1727204636.18334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204636.18344: _low_level_execute_command(): starting 46400 1727204636.18350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229 `" && echo ansible-tmp-1727204636.1833222-55347-258055529467229="` echo /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229 `" ) && sleep 0' 46400 1727204636.18825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.18830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.18878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204636.18890: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204636.18893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.19044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204636.19048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204636.19050: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204636.19052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.19054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204636.19056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.19058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204636.19062: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204636.19066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.19068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.19070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204636.19071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.19135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.20982: stdout chunk (state=3): >>>ansible-tmp-1727204636.1833222-55347-258055529467229=/root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229 <<< 46400 1727204636.21099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.21155: stderr chunk (state=3): >>><<< 46400 1727204636.21158: stdout chunk (state=3): >>><<< 46400 1727204636.21178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204636.1833222-55347-258055529467229=/root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204636.21224: variable 'ansible_module_compression' from source: unknown 46400 1727204636.21265: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 46400 1727204636.21315: variable 'ansible_facts' from source: unknown 46400 1727204636.21447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/AnsiballZ_package_facts.py 46400 1727204636.21569: Sending initial data 46400 1727204636.21573: Sent initial data (162 bytes) 46400 1727204636.22272: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.22285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.22316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204636.22329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.22386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.22397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.22449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.24163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204636.24195: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204636.24237: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp952yx51b 
/root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/AnsiballZ_package_facts.py <<< 46400 1727204636.24267: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204636.25921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.26038: stderr chunk (state=3): >>><<< 46400 1727204636.26043: stdout chunk (state=3): >>><<< 46400 1727204636.26061: done transferring module to remote 46400 1727204636.26075: _low_level_execute_command(): starting 46400 1727204636.26079: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/ /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/AnsiballZ_package_facts.py && sleep 0' 46400 1727204636.26549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.26553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.26591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.26607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.26623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.26663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.26682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.26726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.28447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.28502: stderr chunk (state=3): >>><<< 46400 1727204636.28506: stdout chunk (state=3): >>><<< 46400 1727204636.28520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204636.28523: _low_level_execute_command(): starting 46400 1727204636.28527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/AnsiballZ_package_facts.py && sleep 0' 46400 1727204636.29004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.29008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.29042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.29054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204636.29111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.29122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.29177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.75361: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": 
[{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 46400 1727204636.75403: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": 
"lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 46400 1727204636.75410: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 46400 1727204636.75414: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", 
"version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 46400 1727204636.75441: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 46400 1727204636.75458: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": 
[{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", 
"version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 46400 1727204636.75481: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 46400 1727204636.75513: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": 
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 46400 1727204636.75518: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 46400 1727204636.75542: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 46400 1727204636.75548: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": 
"yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 46400 1727204636.75587: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 46400 1727204636.75600: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 46400 1727204636.77117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204636.77180: stderr chunk (state=3): >>><<< 46400 1727204636.77183: stdout chunk (state=3): >>><<< 46400 1727204636.77228: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": 
"10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": 
"sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": 
"python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": 
"1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": 
"3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": 
"systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204636.78727: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204636.78747: _low_level_execute_command(): starting 46400 1727204636.78751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204636.1833222-55347-258055529467229/ > /dev/null 2>&1 && sleep 0' 46400 1727204636.79237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204636.79244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204636.79281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204636.79293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 
1727204636.79342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204636.79355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204636.79377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204636.79416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204636.81222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204636.81282: stderr chunk (state=3): >>><<< 46400 1727204636.81285: stdout chunk (state=3): >>><<< 46400 1727204636.81298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204636.81303: handler run complete 46400 1727204636.81852: variable 'ansible_facts' from source: unknown 46400 1727204636.82158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.83337: variable 'ansible_facts' from source: unknown 46400 1727204636.83616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.84053: attempt loop complete, returning result 46400 1727204636.84065: _execute() done 46400 1727204636.84069: dumping result to json 46400 1727204636.84194: done dumping result, returning 46400 1727204636.84202: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1303-fda8-0000000026f1] 46400 1727204636.84208: sending task result for task 0affcd87-79f5-1303-fda8-0000000026f1 46400 1727204636.85742: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026f1 46400 1727204636.85745: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204636.85851: no more pending results, returning what we have 46400 1727204636.85854: results queue empty 46400 1727204636.85854: checking for any_errors_fatal 46400 1727204636.85858: done checking for any_errors_fatal 46400 1727204636.85858: checking for max_fail_percentage 46400 1727204636.85862: done checking for max_fail_percentage 46400 1727204636.85862: checking to see if all hosts have failed and the running result is not ok 46400 1727204636.85863: done checking to see if all hosts have 
failed 46400 1727204636.85865: getting the remaining hosts for this loop 46400 1727204636.85866: done getting the remaining hosts for this loop 46400 1727204636.85869: getting the next task for host managed-node2 46400 1727204636.85875: done getting next task for host managed-node2 46400 1727204636.85878: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204636.85881: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204636.85890: getting variables 46400 1727204636.85891: in VariableManager get_vars() 46400 1727204636.85919: Calling all_inventory to load vars for managed-node2 46400 1727204636.85921: Calling groups_inventory to load vars for managed-node2 46400 1727204636.85923: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204636.85930: Calling all_plugins_play to load vars for managed-node2 46400 1727204636.85931: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204636.85933: Calling groups_plugins_play to load vars for managed-node2 46400 1727204636.86667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.87606: done with get_vars() 46400 1727204636.87623: done getting variables 46400 1727204636.87673: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:56 -0400 (0:00:00.739) 0:02:07.161 ***** 46400 1727204636.87704: entering _queue_task() for managed-node2/debug 46400 1727204636.87946: worker is 1 (out of 1 available) 46400 1727204636.87963: exiting _queue_task() for managed-node2/debug 46400 1727204636.87978: done queuing things up, now waiting for results queue to drain 46400 1727204636.87979: waiting for pending results... 
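For reference: the censored result above comes from the role's "Check which packages are installed" step. The log shows package_facts being invoked with module_args {'manager': ['auto'], 'strategy': 'first'} and '_ansible_no_log': True, so the underlying task is probably something close to the sketch below; the exact task in the role may differ.

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # matches the module_args visible in the log; 'strategy: first' is the module default
      no_log: true         # why the result above is reported as censored

The gathered data ends up in ansible_facts.packages, which is the large per-package JSON map printed earlier in this log.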
46400 1727204636.88171: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 46400 1727204636.88269: in run() - task 0affcd87-79f5-1303-fda8-000000002695 46400 1727204636.88279: variable 'ansible_search_path' from source: unknown 46400 1727204636.88285: variable 'ansible_search_path' from source: unknown 46400 1727204636.88314: calling self._execute() 46400 1727204636.88392: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.88397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.88408: variable 'omit' from source: magic vars 46400 1727204636.88681: variable 'ansible_distribution_major_version' from source: facts 46400 1727204636.88691: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204636.88697: variable 'omit' from source: magic vars 46400 1727204636.88741: variable 'omit' from source: magic vars 46400 1727204636.88813: variable 'network_provider' from source: set_fact 46400 1727204636.88827: variable 'omit' from source: magic vars 46400 1727204636.88866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204636.88897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204636.88915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204636.88927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204636.88936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204636.88967: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204636.88970: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.88973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.89040: Set connection var ansible_shell_type to sh 46400 1727204636.89048: Set connection var ansible_shell_executable to /bin/sh 46400 1727204636.89055: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204636.89066: Set connection var ansible_connection to ssh 46400 1727204636.89072: Set connection var ansible_pipelining to False 46400 1727204636.89077: Set connection var ansible_timeout to 10 46400 1727204636.89098: variable 'ansible_shell_executable' from source: unknown 46400 1727204636.89101: variable 'ansible_connection' from source: unknown 46400 1727204636.89104: variable 'ansible_module_compression' from source: unknown 46400 1727204636.89106: variable 'ansible_shell_type' from source: unknown 46400 1727204636.89109: variable 'ansible_shell_executable' from source: unknown 46400 1727204636.89111: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.89114: variable 'ansible_pipelining' from source: unknown 46400 1727204636.89116: variable 'ansible_timeout' from source: unknown 46400 1727204636.89119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.89225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 46400 1727204636.89233: variable 'omit' from source: magic vars 46400 1727204636.89237: starting attempt loop 46400 1727204636.89241: running the handler 46400 1727204636.89279: handler run complete 46400 1727204636.89293: attempt loop complete, returning result 46400 1727204636.89296: _execute() done 46400 1727204636.89299: dumping result to json 46400 1727204636.89301: done dumping result, returning 46400 1727204636.89310: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1303-fda8-000000002695] 46400 1727204636.89315: sending task result for task 0affcd87-79f5-1303-fda8-000000002695 46400 1727204636.89399: done sending task result for task 0affcd87-79f5-1303-fda8-000000002695 46400 1727204636.89402: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 46400 1727204636.89485: no more pending results, returning what we have 46400 1727204636.89488: results queue empty 46400 1727204636.89489: checking for any_errors_fatal 46400 1727204636.89500: done checking for any_errors_fatal 46400 1727204636.89501: checking for max_fail_percentage 46400 1727204636.89502: done checking for max_fail_percentage 46400 1727204636.89503: checking to see if all hosts have failed and the running result is not ok 46400 1727204636.89504: done checking to see if all hosts have failed 46400 1727204636.89505: getting the remaining hosts for this loop 46400 1727204636.89506: done getting the remaining hosts for this loop 46400 1727204636.89510: getting the next task for host managed-node2 46400 1727204636.89519: done getting next task for host managed-node2 46400 1727204636.89523: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204636.89527: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204636.89540: getting variables 46400 1727204636.89542: in VariableManager get_vars() 46400 1727204636.89584: Calling all_inventory to load vars for managed-node2 46400 1727204636.89587: Calling groups_inventory to load vars for managed-node2 46400 1727204636.89590: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204636.89599: Calling all_plugins_play to load vars for managed-node2 46400 1727204636.89601: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204636.89604: Calling groups_plugins_play to load vars for managed-node2 46400 1727204636.90720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.92408: done with get_vars() 46400 1727204636.92438: done getting variables 46400 1727204636.92508: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:56 -0400 (0:00:00.048) 0:02:07.210 ***** 46400 1727204636.92554: entering _queue_task() for managed-node2/fail 46400 1727204636.92914: worker is 1 (out of 1 available) 46400 1727204636.92928: exiting _queue_task() for managed-node2/fail 46400 1727204636.92941: done queuing things up, now waiting for results queue to drain 46400 1727204636.92943: waiting for pending results... 
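The "Print network provider" task that just returned ok with "Using network provider: nm" is a plain debug call over the network_provider fact set earlier in the role (the log resolves the variable "from source: set_fact"). A minimal sketch of such a task, assuming the message is built directly from that variable:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # renders as "Using network provider: nm" on this host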
46400 1727204636.93257: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 46400 1727204636.93377: in run() - task 0affcd87-79f5-1303-fda8-000000002696 46400 1727204636.93387: variable 'ansible_search_path' from source: unknown 46400 1727204636.93391: variable 'ansible_search_path' from source: unknown 46400 1727204636.93421: calling self._execute() 46400 1727204636.93501: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.93506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.93513: variable 'omit' from source: magic vars 46400 1727204636.93802: variable 'ansible_distribution_major_version' from source: facts 46400 1727204636.93811: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204636.93901: variable 'network_state' from source: role '' defaults 46400 1727204636.93910: Evaluated conditional (network_state != {}): False 46400 1727204636.93913: when evaluation is False, skipping this task 46400 1727204636.93916: _execute() done 46400 1727204636.93918: dumping result to json 46400 1727204636.93920: done dumping result, returning 46400 1727204636.93927: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1303-fda8-000000002696] 46400 1727204636.93933: sending task result for task 0affcd87-79f5-1303-fda8-000000002696 46400 1727204636.94026: done sending task result for task 0affcd87-79f5-1303-fda8-000000002696 46400 1727204636.94029: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204636.94078: no more pending results, returning what we have 46400 1727204636.94082: results queue empty 46400 1727204636.94083: checking for any_errors_fatal 46400 1727204636.94090: done checking for any_errors_fatal 46400 1727204636.94091: checking for max_fail_percentage 46400 1727204636.94092: done checking for max_fail_percentage 46400 1727204636.94093: checking to see if all hosts have failed and the running result is not ok 46400 1727204636.94094: done checking to see if all hosts have failed 46400 1727204636.94095: getting the remaining hosts for this loop 46400 1727204636.94096: done getting the remaining hosts for this loop 46400 1727204636.94100: getting the next task for host managed-node2 46400 1727204636.94110: done getting next task for host managed-node2 46400 1727204636.94115: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204636.94120: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204636.94152: getting variables 46400 1727204636.94153: in VariableManager get_vars() 46400 1727204636.94202: Calling all_inventory to load vars for managed-node2 46400 1727204636.94206: Calling groups_inventory to load vars for managed-node2 46400 1727204636.94208: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204636.94220: Calling all_plugins_play to load vars for managed-node2 46400 1727204636.94222: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204636.94225: Calling groups_plugins_play to load vars for managed-node2 46400 1727204636.95077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.96140: done with get_vars() 46400 1727204636.96159: done getting variables 46400 1727204636.96208: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:56 -0400 (0:00:00.036) 0:02:07.246 ***** 46400 1727204636.96234: entering _queue_task() for managed-node2/fail 46400 1727204636.96500: worker is 1 (out of 1 available) 46400 1727204636.96514: exiting _queue_task() for managed-node2/fail 46400 1727204636.96527: done queuing things up, now waiting for results queue to drain 46400 1727204636.96529: waiting for pending results... 
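The abort task above was skipped because its conditional network_state != {} evaluated to False. Given that the log loads the 'fail' action module for it, the guard presumably looks roughly like the following; the failure message is not visible in this log, so the msg text is only a placeholder, and the role may list further conditions that were never evaluated because this one short-circuited:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration requires the NetworkManager provider  # placeholder wording, not taken from the log
      when: network_state != {}   # the only condition visible in the log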
46400 1727204636.96718: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 46400 1727204636.96818: in run() - task 0affcd87-79f5-1303-fda8-000000002697 46400 1727204636.96831: variable 'ansible_search_path' from source: unknown 46400 1727204636.96835: variable 'ansible_search_path' from source: unknown 46400 1727204636.96870: calling self._execute() 46400 1727204636.96946: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204636.96950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204636.96957: variable 'omit' from source: magic vars 46400 1727204636.97250: variable 'ansible_distribution_major_version' from source: facts 46400 1727204636.97260: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204636.97351: variable 'network_state' from source: role '' defaults 46400 1727204636.97359: Evaluated conditional (network_state != {}): False 46400 1727204636.97366: when evaluation is False, skipping this task 46400 1727204636.97370: _execute() done 46400 1727204636.97372: dumping result to json 46400 1727204636.97375: done dumping result, returning 46400 1727204636.97382: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1303-fda8-000000002697] 46400 1727204636.97388: sending task result for task 0affcd87-79f5-1303-fda8-000000002697 46400 1727204636.97483: done sending task result for task 0affcd87-79f5-1303-fda8-000000002697 46400 1727204636.97486: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204636.97551: no more pending results, returning what we have 46400 1727204636.97555: results queue empty 46400 1727204636.97556: checking for any_errors_fatal 46400 1727204636.97570: done checking for any_errors_fatal 46400 1727204636.97571: checking for max_fail_percentage 46400 1727204636.97573: done checking for max_fail_percentage 46400 1727204636.97574: checking to see if all hosts have failed and the running result is not ok 46400 1727204636.97575: done checking to see if all hosts have failed 46400 1727204636.97575: getting the remaining hosts for this loop 46400 1727204636.97577: done getting the remaining hosts for this loop 46400 1727204636.97581: getting the next task for host managed-node2 46400 1727204636.97589: done getting next task for host managed-node2 46400 1727204636.97592: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204636.97597: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204636.97631: getting variables 46400 1727204636.97633: in VariableManager get_vars() 46400 1727204636.97675: Calling all_inventory to load vars for managed-node2 46400 1727204636.97678: Calling groups_inventory to load vars for managed-node2 46400 1727204636.97681: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204636.97690: Calling all_plugins_play to load vars for managed-node2 46400 1727204636.97692: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204636.97694: Calling groups_plugins_play to load vars for managed-node2 46400 1727204636.98509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204636.99444: done with get_vars() 46400 1727204636.99466: done getting variables 46400 1727204636.99510: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:56 -0400 (0:00:00.033) 0:02:07.279 ***** 46400 1727204636.99536: entering _queue_task() for managed-node2/fail 46400 1727204636.99788: worker is 1 (out of 1 available) 46400 1727204636.99803: exiting _queue_task() for managed-node2/fail 46400 1727204636.99816: done queuing things up, now waiting for results queue to drain 46400 1727204636.99817: waiting for pending results... 
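Both network_state guards were skipped for the same reason: the log resolves network_state "from source: role '' defaults" and network_state != {} evaluates to False, which implies the role default is an empty mapping and this play never sets it. In defaults/main.yml terms that amounts to the line below (a sketch, not quoted from the role source); the play instead drives configuration through network_connections, which the log later resolves "from source: include params".

    network_state: {}   # role default; left untouched here, so the state-based abort guards are skipped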
46400 1727204637.00013: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 46400 1727204637.00129: in run() - task 0affcd87-79f5-1303-fda8-000000002698 46400 1727204637.00140: variable 'ansible_search_path' from source: unknown 46400 1727204637.00143: variable 'ansible_search_path' from source: unknown 46400 1727204637.00178: calling self._execute() 46400 1727204637.00251: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.00257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.00270: variable 'omit' from source: magic vars 46400 1727204637.00547: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.00558: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.00686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.02358: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.02416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.02452: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.02479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.02500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.02567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.02586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.02604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.02633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.02646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.02715: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.02730: Evaluated conditional (ansible_distribution_major_version | int > 9): False 46400 1727204637.02733: when evaluation is False, skipping this task 46400 1727204637.02738: _execute() done 46400 1727204637.02741: dumping result to json 46400 1727204637.02743: done dumping result, returning 46400 1727204637.02747: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1303-fda8-000000002698] 46400 1727204637.02750: sending task result for task 
0affcd87-79f5-1303-fda8-000000002698 46400 1727204637.02842: done sending task result for task 0affcd87-79f5-1303-fda8-000000002698 46400 1727204637.02845: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 46400 1727204637.02908: no more pending results, returning what we have 46400 1727204637.02913: results queue empty 46400 1727204637.02914: checking for any_errors_fatal 46400 1727204637.02923: done checking for any_errors_fatal 46400 1727204637.02924: checking for max_fail_percentage 46400 1727204637.02925: done checking for max_fail_percentage 46400 1727204637.02926: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.02927: done checking to see if all hosts have failed 46400 1727204637.02928: getting the remaining hosts for this loop 46400 1727204637.02930: done getting the remaining hosts for this loop 46400 1727204637.02934: getting the next task for host managed-node2 46400 1727204637.02942: done getting next task for host managed-node2 46400 1727204637.02947: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204637.02952: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204637.02989: getting variables 46400 1727204637.02991: in VariableManager get_vars() 46400 1727204637.03034: Calling all_inventory to load vars for managed-node2 46400 1727204637.03037: Calling groups_inventory to load vars for managed-node2 46400 1727204637.03039: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.03048: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.03050: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.03052: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.04094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.09414: done with get_vars() 46400 1727204637.09436: done getting variables 46400 1727204637.09479: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.099) 0:02:07.379 ***** 46400 1727204637.09501: entering _queue_task() for managed-node2/dnf 46400 1727204637.09757: worker is 1 (out of 1 available) 46400 1727204637.09774: exiting _queue_task() for managed-node2/dnf 46400 1727204637.09787: done queuing things up, now waiting for results queue to drain 46400 1727204637.09789: waiting for pending results... 
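The teaming guard was likewise skipped, this time on ansible_distribution_major_version | int > 9 evaluating to False (the package list above is all el9 builds). With the 'fail' action module the log loads for this task, the check is essentially a version gate along these lines; the message text is again a placeholder, not taken from the log:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # placeholder wording
      when: ansible_distribution_major_version | int > 9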
46400 1727204637.09990: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 46400 1727204637.10100: in run() - task 0affcd87-79f5-1303-fda8-000000002699 46400 1727204637.10111: variable 'ansible_search_path' from source: unknown 46400 1727204637.10115: variable 'ansible_search_path' from source: unknown 46400 1727204637.10146: calling self._execute() 46400 1727204637.10235: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.10241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.10253: variable 'omit' from source: magic vars 46400 1727204637.10540: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.10549: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.10695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.12399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.12462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.12492: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.12522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.12543: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.12603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.12627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.12647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.12677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.12688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.12792: variable 'ansible_distribution' from source: facts 46400 1727204637.12795: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.12807: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 46400 1727204637.12894: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.12979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.12996: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.13013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.13038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.13048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.13081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.13097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.13114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.13139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.13149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.13182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.13198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.13214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.13239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.13250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.13355: variable 'network_connections' from source: include params 46400 1727204637.13365: variable 'interface' from source: play vars 46400 1727204637.13420: variable 'interface' from source: play vars 46400 1727204637.13473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204637.13597: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204637.13627: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204637.13650: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204637.13677: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204637.13714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204637.13727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204637.13749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.13772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204637.13807: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204637.13971: variable 'network_connections' from source: include params 46400 1727204637.13974: variable 'interface' from source: play vars 46400 1727204637.14018: variable 'interface' from source: play vars 46400 1727204637.14043: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204637.14048: when evaluation is False, skipping this task 46400 1727204637.14051: _execute() done 46400 1727204637.14054: dumping result to json 46400 1727204637.14056: done dumping result, returning 46400 1727204637.14059: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-000000002699] 46400 1727204637.14061: sending task result for task 0affcd87-79f5-1303-fda8-000000002699 46400 1727204637.14162: done sending task result for task 0affcd87-79f5-1303-fda8-000000002699 46400 1727204637.14166: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204637.14212: no more pending results, returning what we have 46400 1727204637.14216: results queue empty 46400 1727204637.14217: checking for any_errors_fatal 46400 1727204637.14228: done checking for any_errors_fatal 46400 1727204637.14230: checking for max_fail_percentage 46400 1727204637.14231: done checking for max_fail_percentage 46400 1727204637.14232: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.14233: done checking to see if all hosts have failed 46400 1727204637.14234: getting the remaining hosts for this loop 46400 1727204637.14235: done getting the remaining hosts for this loop 46400 1727204637.14239: getting the next task for host managed-node2 46400 1727204637.14248: done getting next task for host managed-node2 46400 1727204637.14259: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204637.14266: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204637.14297: getting variables 46400 1727204637.14298: in VariableManager get_vars() 46400 1727204637.14342: Calling all_inventory to load vars for managed-node2 46400 1727204637.14344: Calling groups_inventory to load vars for managed-node2 46400 1727204637.14347: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.14356: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.14358: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.14368: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.15236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.16183: done with get_vars() 46400 1727204637.16204: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 46400 1727204637.16258: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.067) 0:02:07.447 ***** 46400 1727204637.16285: entering _queue_task() for managed-node2/yum 46400 1727204637.16526: worker is 1 (out of 1 available) 46400 1727204637.16541: exiting _queue_task() for managed-node2/yum 46400 1727204637.16554: done queuing things up, now waiting for results queue to drain 46400 1727204637.16556: waiting for pending results... 
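For reference: the DNF package-manager check logged above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this run, and the YUM variant just announced (tasks/main.yml:48) is the same check gated on older distributions, with its yum action redirected to ansible.builtin.dnf on this host. A minimal sketch of such a guarded update check, reconstructed only from the task names, the action plugins, and the conditions reported in the log; the module arguments are assumptions, not taken from the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    list: updates   # assumed argument; the log shows only the action, not its parameters
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

The YUM variant would carry the mirrored guard ansible_distribution_major_version | int < 8, which is the false_condition reported for it below.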
46400 1727204637.16752: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 46400 1727204637.16853: in run() - task 0affcd87-79f5-1303-fda8-00000000269a 46400 1727204637.16868: variable 'ansible_search_path' from source: unknown 46400 1727204637.16878: variable 'ansible_search_path' from source: unknown 46400 1727204637.16909: calling self._execute() 46400 1727204637.16994: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.16999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.17006: variable 'omit' from source: magic vars 46400 1727204637.17310: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.17316: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.17438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.19099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.19447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.19478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.19505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.19526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.19584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.19606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.19626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.19654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.19670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.19743: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.19755: Evaluated conditional (ansible_distribution_major_version | int < 8): False 46400 1727204637.19758: when evaluation is False, skipping this task 46400 1727204637.19765: _execute() done 46400 1727204637.19767: dumping result to json 46400 1727204637.19770: done dumping result, returning 46400 1727204637.19774: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000269a] 46400 
1727204637.19780: sending task result for task 0affcd87-79f5-1303-fda8-00000000269a 46400 1727204637.19872: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269a 46400 1727204637.19875: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 46400 1727204637.19925: no more pending results, returning what we have 46400 1727204637.19929: results queue empty 46400 1727204637.19930: checking for any_errors_fatal 46400 1727204637.19937: done checking for any_errors_fatal 46400 1727204637.19938: checking for max_fail_percentage 46400 1727204637.19939: done checking for max_fail_percentage 46400 1727204637.19940: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.19941: done checking to see if all hosts have failed 46400 1727204637.19942: getting the remaining hosts for this loop 46400 1727204637.19944: done getting the remaining hosts for this loop 46400 1727204637.19947: getting the next task for host managed-node2 46400 1727204637.19956: done getting next task for host managed-node2 46400 1727204637.19962: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204637.19969: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204637.20005: getting variables 46400 1727204637.20007: in VariableManager get_vars() 46400 1727204637.20050: Calling all_inventory to load vars for managed-node2 46400 1727204637.20053: Calling groups_inventory to load vars for managed-node2 46400 1727204637.20055: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.20070: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.20072: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.20075: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.21080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.22003: done with get_vars() 46400 1727204637.22019: done getting variables 46400 1727204637.22069: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.058) 0:02:07.505 ***** 46400 1727204637.22096: entering _queue_task() for managed-node2/fail 46400 1727204637.22336: worker is 1 (out of 1 available) 46400 1727204637.22350: exiting _queue_task() for managed-node2/fail 46400 1727204637.22370: done queuing things up, now waiting for results queue to drain 46400 1727204637.22372: waiting for pending results... 
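The consent task just announced (tasks/main.yml:60) loads the fail action plugin and, per its name and the conditions evaluated below, is gated on the same wireless/team guard. A minimal sketch under those assumptions; the message text is placeholder wording, not taken from the log:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required for wireless or team interfaces.   # illustrative text only
  when: __network_wireless_connections_defined or __network_team_connections_defined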
46400 1727204637.22556: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 46400 1727204637.22678: in run() - task 0affcd87-79f5-1303-fda8-00000000269b 46400 1727204637.22690: variable 'ansible_search_path' from source: unknown 46400 1727204637.22695: variable 'ansible_search_path' from source: unknown 46400 1727204637.22727: calling self._execute() 46400 1727204637.22806: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.22811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.22817: variable 'omit' from source: magic vars 46400 1727204637.23113: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.23123: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.23214: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.23347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.24989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.25051: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.25084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.25113: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.25133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.25194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.25215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.25237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.25270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.25282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.25315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.25335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.25352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.25387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.25398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.25430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.25447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.25467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.25494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.25504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.25628: variable 'network_connections' from source: include params 46400 1727204637.25642: variable 'interface' from source: play vars 46400 1727204637.25691: variable 'interface' from source: play vars 46400 1727204637.25742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204637.25861: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204637.25904: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204637.25927: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204637.25950: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204637.25987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204637.26002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204637.26020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.26039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204637.26081: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204637.26246: variable 'network_connections' 
from source: include params 46400 1727204637.26251: variable 'interface' from source: play vars 46400 1727204637.26298: variable 'interface' from source: play vars 46400 1727204637.26322: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204637.26326: when evaluation is False, skipping this task 46400 1727204637.26329: _execute() done 46400 1727204637.26331: dumping result to json 46400 1727204637.26334: done dumping result, returning 46400 1727204637.26339: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000269b] 46400 1727204637.26345: sending task result for task 0affcd87-79f5-1303-fda8-00000000269b 46400 1727204637.26448: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269b 46400 1727204637.26451: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204637.26511: no more pending results, returning what we have 46400 1727204637.26515: results queue empty 46400 1727204637.26517: checking for any_errors_fatal 46400 1727204637.26529: done checking for any_errors_fatal 46400 1727204637.26530: checking for max_fail_percentage 46400 1727204637.26532: done checking for max_fail_percentage 46400 1727204637.26533: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.26534: done checking to see if all hosts have failed 46400 1727204637.26535: getting the remaining hosts for this loop 46400 1727204637.26537: done getting the remaining hosts for this loop 46400 1727204637.26541: getting the next task for host managed-node2 46400 1727204637.26550: done getting next task for host managed-node2 46400 1727204637.26557: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 46400 1727204637.26563: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204637.26592: getting variables 46400 1727204637.26594: in VariableManager get_vars() 46400 1727204637.26643: Calling all_inventory to load vars for managed-node2 46400 1727204637.26645: Calling groups_inventory to load vars for managed-node2 46400 1727204637.26647: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.26657: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.26659: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.26661: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.27523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.28475: done with get_vars() 46400 1727204637.28497: done getting variables 46400 1727204637.28542: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.064) 0:02:07.570 ***** 46400 1727204637.28571: entering _queue_task() for managed-node2/package 46400 1727204637.28827: worker is 1 (out of 1 available) 46400 1727204637.28841: exiting _queue_task() for managed-node2/package 46400 1727204637.28855: done queuing things up, now waiting for results queue to drain 46400 1727204637.28857: waiting for pending results... 46400 1727204637.29053: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 46400 1727204637.29166: in run() - task 0affcd87-79f5-1303-fda8-00000000269c 46400 1727204637.29181: variable 'ansible_search_path' from source: unknown 46400 1727204637.29186: variable 'ansible_search_path' from source: unknown 46400 1727204637.29216: calling self._execute() 46400 1727204637.29301: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.29310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.29318: variable 'omit' from source: magic vars 46400 1727204637.29619: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.29629: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.29770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204637.29967: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204637.30004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204637.30030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204637.30099: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204637.30185: variable 'network_packages' from source: role '' defaults 46400 1727204637.30265: variable '__network_provider_setup' from source: role '' defaults 46400 1727204637.30271: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204637.30321: variable 
'__network_service_name_default_nm' from source: role '' defaults 46400 1727204637.30326: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204637.30380: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204637.30500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.32253: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.32303: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.32332: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.32358: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.32382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.32444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.32468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.32486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.32517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.32527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.32563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.32585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.32601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.32633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.32643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.32811: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204637.32893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.32910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.32931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.32956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.32971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.33040: variable 'ansible_python' from source: facts 46400 1727204637.33053: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204637.33115: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204637.33175: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204637.33259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.33283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.33301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.33325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.33336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.33377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.33393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.33410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.33434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.33446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.33543: variable 'network_connections' from source: include params 46400 1727204637.33549: variable 'interface' from source: play vars 46400 1727204637.33625: variable 'interface' from source: play vars 46400 1727204637.33680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204637.33703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204637.33724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.33745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204637.33786: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.33972: variable 'network_connections' from source: include params 46400 1727204637.33976: variable 'interface' from source: play vars 46400 1727204637.34048: variable 'interface' from source: play vars 46400 1727204637.34075: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204637.34134: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.34338: variable 'network_connections' from source: include params 46400 1727204637.34341: variable 'interface' from source: play vars 46400 1727204637.34390: variable 'interface' from source: play vars 46400 1727204637.34406: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204637.34463: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204637.34663: variable 'network_connections' from source: include params 46400 1727204637.34666: variable 'interface' from source: play vars 46400 1727204637.34714: variable 'interface' from source: play vars 46400 1727204637.34749: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204637.34798: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204637.34803: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204637.34847: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204637.34989: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204637.35295: variable 'network_connections' from source: include params 46400 1727204637.35299: variable 'interface' from source: play vars 46400 1727204637.35344: variable 'interface' from source: play vars 46400 1727204637.35350: variable 'ansible_distribution' from source: facts 46400 1727204637.35353: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.35360: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.35375: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204637.35486: variable 'ansible_distribution' from source: 
facts 46400 1727204637.35489: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.35494: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.35504: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204637.35616: variable 'ansible_distribution' from source: facts 46400 1727204637.35619: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.35628: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.35654: variable 'network_provider' from source: set_fact 46400 1727204637.35669: variable 'ansible_facts' from source: unknown 46400 1727204637.36073: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 46400 1727204637.36077: when evaluation is False, skipping this task 46400 1727204637.36079: _execute() done 46400 1727204637.36081: dumping result to json 46400 1727204637.36083: done dumping result, returning 46400 1727204637.36089: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1303-fda8-00000000269c] 46400 1727204637.36094: sending task result for task 0affcd87-79f5-1303-fda8-00000000269c 46400 1727204637.36196: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269c 46400 1727204637.36199: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 46400 1727204637.36244: no more pending results, returning what we have 46400 1727204637.36248: results queue empty 46400 1727204637.36249: checking for any_errors_fatal 46400 1727204637.36258: done checking for any_errors_fatal 46400 1727204637.36258: checking for max_fail_percentage 46400 1727204637.36260: done checking for max_fail_percentage 46400 1727204637.36261: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.36263: done checking to see if all hosts have failed 46400 1727204637.36265: getting the remaining hosts for this loop 46400 1727204637.36267: done getting the remaining hosts for this loop 46400 1727204637.36271: getting the next task for host managed-node2 46400 1727204637.36283: done getting next task for host managed-node2 46400 1727204637.36291: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204637.36296: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204637.36326: getting variables 46400 1727204637.36328: in VariableManager get_vars() 46400 1727204637.36379: Calling all_inventory to load vars for managed-node2 46400 1727204637.36382: Calling groups_inventory to load vars for managed-node2 46400 1727204637.36388: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.36401: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.36404: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.36406: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.37412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.38343: done with get_vars() 46400 1727204637.38360: done getting variables 46400 1727204637.38407: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.098) 0:02:07.668 ***** 46400 1727204637.38433: entering _queue_task() for managed-node2/package 46400 1727204637.38690: worker is 1 (out of 1 available) 46400 1727204637.38704: exiting _queue_task() for managed-node2/package 46400 1727204637.38717: done queuing things up, now waiting for results queue to drain 46400 1727204637.38719: waiting for pending results... 
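The Install packages task above (tasks/main.yml:73) was skipped because every entry in network_packages is already present in ansible_facts.packages on managed-node2. A minimal sketch reconstructed from the task name, the package action plugin loaded in the log, and the reported false_condition; the exact task body in the role is not shown in this output:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - not network_packages is subset(ansible_facts.packages.keys())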
46400 1727204637.38918: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 46400 1727204637.39023: in run() - task 0affcd87-79f5-1303-fda8-00000000269d 46400 1727204637.39034: variable 'ansible_search_path' from source: unknown 46400 1727204637.39038: variable 'ansible_search_path' from source: unknown 46400 1727204637.39071: calling self._execute() 46400 1727204637.39157: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.39160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.39174: variable 'omit' from source: magic vars 46400 1727204637.39459: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.39473: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.39566: variable 'network_state' from source: role '' defaults 46400 1727204637.39576: Evaluated conditional (network_state != {}): False 46400 1727204637.39579: when evaluation is False, skipping this task 46400 1727204637.39581: _execute() done 46400 1727204637.39585: dumping result to json 46400 1727204637.39587: done dumping result, returning 46400 1727204637.39594: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1303-fda8-00000000269d] 46400 1727204637.39600: sending task result for task 0affcd87-79f5-1303-fda8-00000000269d 46400 1727204637.39702: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269d 46400 1727204637.39705: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204637.39749: no more pending results, returning what we have 46400 1727204637.39753: results queue empty 46400 1727204637.39754: checking for any_errors_fatal 46400 1727204637.39763: done checking for any_errors_fatal 46400 1727204637.39765: checking for max_fail_percentage 46400 1727204637.39767: done checking for max_fail_percentage 46400 1727204637.39768: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.39769: done checking to see if all hosts have failed 46400 1727204637.39770: getting the remaining hosts for this loop 46400 1727204637.39771: done getting the remaining hosts for this loop 46400 1727204637.39775: getting the next task for host managed-node2 46400 1727204637.39784: done getting next task for host managed-node2 46400 1727204637.39788: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204637.39794: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204637.39826: getting variables 46400 1727204637.39828: in VariableManager get_vars() 46400 1727204637.39875: Calling all_inventory to load vars for managed-node2 46400 1727204637.39878: Calling groups_inventory to load vars for managed-node2 46400 1727204637.39880: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.39891: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.39894: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.39896: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.40733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.41816: done with get_vars() 46400 1727204637.41832: done getting variables 46400 1727204637.41881: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.034) 0:02:07.703 ***** 46400 1727204637.41908: entering _queue_task() for managed-node2/package 46400 1727204637.42149: worker is 1 (out of 1 available) 46400 1727204637.42167: exiting _queue_task() for managed-node2/package 46400 1727204637.42180: done queuing things up, now waiting for results queue to drain 46400 1727204637.42182: waiting for pending results... 
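Both network_state-gated install tasks (tasks/main.yml:85 above and tasks/main.yml:96 just announced) use the package action and the same guard, which is False here because network_state resolves to an empty dict from the role defaults. A minimal sketch, with the package lists inferred only from the task names rather than from the log:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # inferred from the task name
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate   # inferred from the task name
    state: present
  when: network_state != {}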
46400 1727204637.42375: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 46400 1727204637.42469: in run() - task 0affcd87-79f5-1303-fda8-00000000269e 46400 1727204637.42487: variable 'ansible_search_path' from source: unknown 46400 1727204637.42491: variable 'ansible_search_path' from source: unknown 46400 1727204637.42521: calling self._execute() 46400 1727204637.42610: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.42614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.42624: variable 'omit' from source: magic vars 46400 1727204637.42925: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.42935: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.43027: variable 'network_state' from source: role '' defaults 46400 1727204637.43036: Evaluated conditional (network_state != {}): False 46400 1727204637.43039: when evaluation is False, skipping this task 46400 1727204637.43041: _execute() done 46400 1727204637.43044: dumping result to json 46400 1727204637.43046: done dumping result, returning 46400 1727204637.43054: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1303-fda8-00000000269e] 46400 1727204637.43062: sending task result for task 0affcd87-79f5-1303-fda8-00000000269e 46400 1727204637.43154: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269e 46400 1727204637.43157: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204637.43206: no more pending results, returning what we have 46400 1727204637.43210: results queue empty 46400 1727204637.43211: checking for any_errors_fatal 46400 1727204637.43220: done checking for any_errors_fatal 46400 1727204637.43220: checking for max_fail_percentage 46400 1727204637.43222: done checking for max_fail_percentage 46400 1727204637.43223: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.43224: done checking to see if all hosts have failed 46400 1727204637.43225: getting the remaining hosts for this loop 46400 1727204637.43226: done getting the remaining hosts for this loop 46400 1727204637.43230: getting the next task for host managed-node2 46400 1727204637.43240: done getting next task for host managed-node2 46400 1727204637.43244: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204637.43250: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204637.43281: getting variables 46400 1727204637.43283: in VariableManager get_vars() 46400 1727204637.43324: Calling all_inventory to load vars for managed-node2 46400 1727204637.43327: Calling groups_inventory to load vars for managed-node2 46400 1727204637.43329: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.43339: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.43341: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.43345: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.44169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.45108: done with get_vars() 46400 1727204637.45127: done getting variables 46400 1727204637.45178: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.032) 0:02:07.736 ***** 46400 1727204637.45207: entering _queue_task() for managed-node2/service 46400 1727204637.45468: worker is 1 (out of 1 available) 46400 1727204637.45482: exiting _queue_task() for managed-node2/service 46400 1727204637.45495: done queuing things up, now waiting for results queue to drain 46400 1727204637.45496: waiting for pending results... 
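The restart task just announced (tasks/main.yml:109) loads the service action plugin and, as the evaluation below shows, is skipped under the same wireless/team guard. A minimal sketch assuming the service name from the task name; only the action, the task name, and the when-condition come from the log:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined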
46400 1727204637.45693: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 46400 1727204637.45810: in run() - task 0affcd87-79f5-1303-fda8-00000000269f 46400 1727204637.45822: variable 'ansible_search_path' from source: unknown 46400 1727204637.45826: variable 'ansible_search_path' from source: unknown 46400 1727204637.45857: calling self._execute() 46400 1727204637.45937: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.45941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.45953: variable 'omit' from source: magic vars 46400 1727204637.46239: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.46249: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.46342: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.46481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.48122: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.48179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.48206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.48233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.48255: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.48316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.48336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.48366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.48393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.48403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.48436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.48461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.48478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 46400 1727204637.48505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.48515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.48544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.48568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.48585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.48610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.48621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.48744: variable 'network_connections' from source: include params 46400 1727204637.48753: variable 'interface' from source: play vars 46400 1727204637.48807: variable 'interface' from source: play vars 46400 1727204637.48862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204637.48974: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204637.49289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204637.49312: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204637.49336: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204637.49368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204637.49385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204637.49402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.49420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204637.49463: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204637.49621: variable 'network_connections' from source: include params 46400 1727204637.49624: variable 'interface' 
from source: play vars 46400 1727204637.49677: variable 'interface' from source: play vars 46400 1727204637.49696: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 46400 1727204637.49700: when evaluation is False, skipping this task 46400 1727204637.49702: _execute() done 46400 1727204637.49705: dumping result to json 46400 1727204637.49707: done dumping result, returning 46400 1727204637.49713: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1303-fda8-00000000269f] 46400 1727204637.49718: sending task result for task 0affcd87-79f5-1303-fda8-00000000269f 46400 1727204637.49815: done sending task result for task 0affcd87-79f5-1303-fda8-00000000269f 46400 1727204637.49824: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 46400 1727204637.49877: no more pending results, returning what we have 46400 1727204637.49881: results queue empty 46400 1727204637.49882: checking for any_errors_fatal 46400 1727204637.49889: done checking for any_errors_fatal 46400 1727204637.49889: checking for max_fail_percentage 46400 1727204637.49891: done checking for max_fail_percentage 46400 1727204637.49892: checking to see if all hosts have failed and the running result is not ok 46400 1727204637.49893: done checking to see if all hosts have failed 46400 1727204637.49894: getting the remaining hosts for this loop 46400 1727204637.49896: done getting the remaining hosts for this loop 46400 1727204637.49900: getting the next task for host managed-node2 46400 1727204637.49908: done getting next task for host managed-node2 46400 1727204637.49912: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204637.49917: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204637.49945: getting variables 46400 1727204637.49947: in VariableManager get_vars() 46400 1727204637.50002: Calling all_inventory to load vars for managed-node2 46400 1727204637.50005: Calling groups_inventory to load vars for managed-node2 46400 1727204637.50007: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204637.50017: Calling all_plugins_play to load vars for managed-node2 46400 1727204637.50019: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204637.50022: Calling groups_plugins_play to load vars for managed-node2 46400 1727204637.51061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204637.51983: done with get_vars() 46400 1727204637.52000: done getting variables 46400 1727204637.52047: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.068) 0:02:07.805 ***** 46400 1727204637.52076: entering _queue_task() for managed-node2/service 46400 1727204637.52324: worker is 1 (out of 1 available) 46400 1727204637.52337: exiting _queue_task() for managed-node2/service 46400 1727204637.52348: done queuing things up, now waiting for results queue to drain 46400 1727204637.52350: waiting for pending results... 46400 1727204637.52543: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 46400 1727204637.52646: in run() - task 0affcd87-79f5-1303-fda8-0000000026a0 46400 1727204637.52657: variable 'ansible_search_path' from source: unknown 46400 1727204637.52663: variable 'ansible_search_path' from source: unknown 46400 1727204637.52694: calling self._execute() 46400 1727204637.52772: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.52777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.52786: variable 'omit' from source: magic vars 46400 1727204637.53073: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.53083: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204637.53203: variable 'network_provider' from source: set_fact 46400 1727204637.53207: variable 'network_state' from source: role '' defaults 46400 1727204637.53216: Evaluated conditional (network_provider == "nm" or network_state != {}): True 46400 1727204637.53222: variable 'omit' from source: magic vars 46400 1727204637.53272: variable 'omit' from source: magic vars 46400 1727204637.53293: variable 'network_service_name' from source: role '' defaults 46400 1727204637.53343: variable 'network_service_name' from source: role '' defaults 46400 1727204637.53415: variable '__network_provider_setup' from source: role '' defaults 46400 1727204637.53419: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204637.53469: variable '__network_service_name_default_nm' from source: role '' defaults 46400 1727204637.53477: variable '__network_packages_default_nm' from source: role '' defaults 
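Here the conditional (network_provider == "nm" or network_state != {}) has evaluated True, so the "Enable and start NetworkManager" task at main.yml:122 will actually run; the service action plugin later hands off to the systemd module (AnsiballZ_systemd.py is transferred further down). A hedged, self-contained sketch of an equivalent task, with the variable values assumed from this run rather than taken from the role's source, is:

# enable_nm_sketch.yml -- hypothetical stand-in for the "Enable and start NetworkManager" step
- hosts: managed-node2
  gather_facts: false
  vars:
    network_provider: nm              # set via set_fact in this run
    network_state: {}
    network_service_name: NetworkManager
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

The module arguments visible later in the log (name NetworkManager, state started, enabled true) correspond to this shape of invocation.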
46400 1727204637.53521: variable '__network_packages_default_nm' from source: role '' defaults 46400 1727204637.53681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204637.55270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204637.55325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204637.55354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204637.55382: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204637.55404: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204637.55468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.55488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.55506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.55536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.55547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.55581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.55598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.55615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.55645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.55655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.55810: variable '__network_packages_default_gobject_packages' from source: role '' defaults 46400 1727204637.55890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.55907: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.55923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.55951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.55965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.56025: variable 'ansible_python' from source: facts 46400 1727204637.56038: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 46400 1727204637.56099: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204637.56152: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204637.56236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.56252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.56275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.56302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.56312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.56344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204637.56367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204637.56386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.56413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204637.56423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204637.56519: variable 'network_connections' from 
source: include params 46400 1727204637.56526: variable 'interface' from source: play vars 46400 1727204637.56581: variable 'interface' from source: play vars 46400 1727204637.56654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204637.56788: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204637.56824: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204637.56857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204637.56888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204637.56931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204637.56955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204637.56980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204637.57003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 46400 1727204637.57040: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.57222: variable 'network_connections' from source: include params 46400 1727204637.57228: variable 'interface' from source: play vars 46400 1727204637.57286: variable 'interface' from source: play vars 46400 1727204637.57309: variable '__network_packages_default_wireless' from source: role '' defaults 46400 1727204637.57367: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204637.57551: variable 'network_connections' from source: include params 46400 1727204637.57555: variable 'interface' from source: play vars 46400 1727204637.57610: variable 'interface' from source: play vars 46400 1727204637.57626: variable '__network_packages_default_team' from source: role '' defaults 46400 1727204637.57682: variable '__network_team_connections_defined' from source: role '' defaults 46400 1727204637.57872: variable 'network_connections' from source: include params 46400 1727204637.57875: variable 'interface' from source: play vars 46400 1727204637.57925: variable 'interface' from source: play vars 46400 1727204637.57968: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204637.58011: variable '__network_service_name_default_initscripts' from source: role '' defaults 46400 1727204637.58021: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204637.58070: variable '__network_packages_default_initscripts' from source: role '' defaults 46400 1727204637.58210: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 46400 1727204637.58542: variable 'network_connections' from source: include params 46400 1727204637.58545: variable 'interface' from source: play vars 46400 1727204637.58595: variable 'interface' from 
source: play vars 46400 1727204637.58601: variable 'ansible_distribution' from source: facts 46400 1727204637.58604: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.58609: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.58620: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 46400 1727204637.58736: variable 'ansible_distribution' from source: facts 46400 1727204637.58739: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.58744: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.58754: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 46400 1727204637.58879: variable 'ansible_distribution' from source: facts 46400 1727204637.58883: variable '__network_rh_distros' from source: role '' defaults 46400 1727204637.58885: variable 'ansible_distribution_major_version' from source: facts 46400 1727204637.58911: variable 'network_provider' from source: set_fact 46400 1727204637.58928: variable 'omit' from source: magic vars 46400 1727204637.58950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204637.58975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204637.58991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204637.59005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204637.59014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204637.59037: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204637.59040: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.59044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.59112: Set connection var ansible_shell_type to sh 46400 1727204637.59122: Set connection var ansible_shell_executable to /bin/sh 46400 1727204637.59127: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204637.59132: Set connection var ansible_connection to ssh 46400 1727204637.59137: Set connection var ansible_pipelining to False 46400 1727204637.59142: Set connection var ansible_timeout to 10 46400 1727204637.59165: variable 'ansible_shell_executable' from source: unknown 46400 1727204637.59168: variable 'ansible_connection' from source: unknown 46400 1727204637.59170: variable 'ansible_module_compression' from source: unknown 46400 1727204637.59173: variable 'ansible_shell_type' from source: unknown 46400 1727204637.59175: variable 'ansible_shell_executable' from source: unknown 46400 1727204637.59179: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204637.59183: variable 'ansible_pipelining' from source: unknown 46400 1727204637.59185: variable 'ansible_timeout' from source: unknown 46400 1727204637.59189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204637.59266: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204637.59277: variable 'omit' from source: magic vars 46400 1727204637.59282: starting attempt loop 46400 1727204637.59285: running the handler 46400 1727204637.59343: variable 'ansible_facts' from source: unknown 46400 1727204637.59847: _low_level_execute_command(): starting 46400 1727204637.59855: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204637.60370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204637.60386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204637.60398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.60420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.60467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204637.60480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204637.60538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204637.62211: stdout chunk (state=3): >>>/root <<< 46400 1727204637.62312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204637.62373: stderr chunk (state=3): >>><<< 46400 1727204637.62377: stdout chunk (state=3): >>><<< 46400 1727204637.62395: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204637.62405: 
_low_level_execute_command(): starting 46400 1727204637.62410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582 `" && echo ansible-tmp-1727204637.623948-55424-163907260769582="` echo /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582 `" ) && sleep 0' 46400 1727204637.62877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204637.62898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204637.62910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.62921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.62975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204637.62990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204637.63027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204637.64886: stdout chunk (state=3): >>>ansible-tmp-1727204637.623948-55424-163907260769582=/root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582 <<< 46400 1727204637.65000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204637.65055: stderr chunk (state=3): >>><<< 46400 1727204637.65058: stdout chunk (state=3): >>><<< 46400 1727204637.65073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204637.623948-55424-163907260769582=/root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 46400 1727204637.65099: variable 'ansible_module_compression' from source: unknown 46400 1727204637.65144: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 46400 1727204637.65198: variable 'ansible_facts' from source: unknown 46400 1727204637.65330: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/AnsiballZ_systemd.py 46400 1727204637.65442: Sending initial data 46400 1727204637.65452: Sent initial data (155 bytes) 46400 1727204637.66127: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204637.66145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204637.66148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204637.66188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204637.66195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204637.66198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204637.66200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.66255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204637.66258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204637.66303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204637.68044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204637.68080: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204637.68118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpgtz7wy91 /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/AnsiballZ_systemd.py <<< 46400 1727204637.68153: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204637.69832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204637.69937: stderr chunk (state=3): >>><<< 46400 
1727204637.69940: stdout chunk (state=3): >>><<< 46400 1727204637.69955: done transferring module to remote 46400 1727204637.69968: _low_level_execute_command(): starting 46400 1727204637.69973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/ /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/AnsiballZ_systemd.py && sleep 0' 46400 1727204637.70419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204637.70439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204637.70458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.70474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.70515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204637.70526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204637.70578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204637.72308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204637.72363: stderr chunk (state=3): >>><<< 46400 1727204637.72368: stdout chunk (state=3): >>><<< 46400 1727204637.72378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204637.72381: _low_level_execute_command(): starting 46400 1727204637.72385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/AnsiballZ_systemd.py && sleep 0' 46400 1727204637.72825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204637.72839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204637.72855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204637.72872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204637.72921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204637.72941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204637.72981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204637.98373: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6930432", "MemoryAvailable": "infinity", "CPUUsageNSec": "2335518000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", 
"SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdog<<< 46400 1727204637.98384: stdout chunk (state=3): >>>Signal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 46400 1727204638.00074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204638.00078: stdout chunk (state=3): >>><<< 46400 1727204638.00081: stderr chunk (state=3): >>><<< 46400 1727204638.00398: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6930432", "MemoryAvailable": "infinity", "CPUUsageNSec": "2335518000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204638.00408: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204638.00412: _low_level_execute_command(): starting 46400 1727204638.00414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204637.623948-55424-163907260769582/ > /dev/null 2>&1 && sleep 0' 46400 1727204638.00994: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.01010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.01026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.01045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.01095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.01110: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.01125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.01143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204638.01156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.01173: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.01185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.01198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.01218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.01230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.01243: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204638.01256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.01338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.01355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.01375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.01455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.03368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.03373: stdout chunk (state=3): >>><<< 46400 1727204638.03384: stderr chunk (state=3): >>><<< 46400 1727204638.03770: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204638.03774: handler run complete 46400 1727204638.03777: attempt loop complete, returning result 46400 1727204638.03779: _execute() done 46400 1727204638.03781: dumping result to json 46400 1727204638.03782: done dumping result, returning 46400 1727204638.03784: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1303-fda8-0000000026a0] 46400 1727204638.03786: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a0 46400 1727204638.03936: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a0 46400 1727204638.03939: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204638.04010: no more pending results, returning what we have 46400 1727204638.04014: results queue empty 46400 1727204638.04015: checking for any_errors_fatal 46400 1727204638.04022: done checking for any_errors_fatal 46400 1727204638.04023: checking for max_fail_percentage 46400 1727204638.04025: done checking for max_fail_percentage 46400 1727204638.04026: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.04027: done checking to see if all hosts have failed 46400 1727204638.04028: getting the remaining hosts for this loop 46400 1727204638.04029: done getting the remaining hosts for this loop 46400 1727204638.04033: getting the next task for host managed-node2 46400 1727204638.04042: done getting next task for host managed-node2 46400 1727204638.04046: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204638.04054: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204638.04071: getting variables 46400 1727204638.04073: in VariableManager get_vars() 46400 1727204638.04117: Calling all_inventory to load vars for managed-node2 46400 1727204638.04120: Calling groups_inventory to load vars for managed-node2 46400 1727204638.04122: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.04133: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.04136: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.04139: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.05948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.07671: done with get_vars() 46400 1727204638.07702: done getting variables 46400 1727204638.07770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.557) 0:02:08.362 ***** 46400 1727204638.07808: entering _queue_task() for managed-node2/service 46400 1727204638.08156: worker is 1 (out of 1 available) 46400 1727204638.08170: exiting _queue_task() for managed-node2/service 46400 1727204638.08187: done queuing things up, now waiting for results queue to drain 46400 1727204638.08189: waiting for pending results... 
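Note that the NetworkManager task above reports only a "censored" result because no_log is in effect for it ('_ansible_no_log': True appears in the module args, and the result text itself says "'no_log: true' was specified"). A hypothetical example of where that keyword is typically set follows; whether the role sets it on the task itself or it is inherited from the calling play is not visible in this excerpt.

# hypothetical placement of the no_log keyword
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # hides the module output in logs and callbacks, as seen above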
46400 1727204638.08487: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 46400 1727204638.08652: in run() - task 0affcd87-79f5-1303-fda8-0000000026a1 46400 1727204638.08673: variable 'ansible_search_path' from source: unknown 46400 1727204638.08681: variable 'ansible_search_path' from source: unknown 46400 1727204638.08717: calling self._execute() 46400 1727204638.08827: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.08845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.08859: variable 'omit' from source: magic vars 46400 1727204638.09254: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.09275: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.09409: variable 'network_provider' from source: set_fact 46400 1727204638.09419: Evaluated conditional (network_provider == "nm"): True 46400 1727204638.09521: variable '__network_wpa_supplicant_required' from source: role '' defaults 46400 1727204638.09622: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 46400 1727204638.09822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204638.12595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204638.12681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204638.12724: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204638.12774: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204638.12806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204638.12897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204638.12931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204638.12962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204638.13014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204638.13034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204638.13092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204638.13122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 46400 1727204638.13152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204638.13204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204638.13223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204638.13271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204638.13305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204638.13336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204638.13382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204638.13403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204638.13570: variable 'network_connections' from source: include params 46400 1727204638.13589: variable 'interface' from source: play vars 46400 1727204638.13671: variable 'interface' from source: play vars 46400 1727204638.13775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 46400 1727204638.13949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 46400 1727204638.13998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 46400 1727204638.14036: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 46400 1727204638.14078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 46400 1727204638.14124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 46400 1727204638.14152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 46400 1727204638.14191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204638.14222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
46400 1727204638.14278: variable '__network_wireless_connections_defined' from source: role '' defaults 46400 1727204638.14553: variable 'network_connections' from source: include params 46400 1727204638.14567: variable 'interface' from source: play vars 46400 1727204638.14638: variable 'interface' from source: play vars 46400 1727204638.14679: Evaluated conditional (__network_wpa_supplicant_required): False 46400 1727204638.14688: when evaluation is False, skipping this task 46400 1727204638.14695: _execute() done 46400 1727204638.14703: dumping result to json 46400 1727204638.14711: done dumping result, returning 46400 1727204638.14730: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1303-fda8-0000000026a1] 46400 1727204638.14750: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a1 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 46400 1727204638.14907: no more pending results, returning what we have 46400 1727204638.14911: results queue empty 46400 1727204638.14913: checking for any_errors_fatal 46400 1727204638.14935: done checking for any_errors_fatal 46400 1727204638.14936: checking for max_fail_percentage 46400 1727204638.14938: done checking for max_fail_percentage 46400 1727204638.14939: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.14940: done checking to see if all hosts have failed 46400 1727204638.14941: getting the remaining hosts for this loop 46400 1727204638.14943: done getting the remaining hosts for this loop 46400 1727204638.14947: getting the next task for host managed-node2 46400 1727204638.14956: done getting next task for host managed-node2 46400 1727204638.14960: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204638.14967: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204638.14999: getting variables 46400 1727204638.15001: in VariableManager get_vars() 46400 1727204638.15051: Calling all_inventory to load vars for managed-node2 46400 1727204638.15054: Calling groups_inventory to load vars for managed-node2 46400 1727204638.15057: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.15069: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.15072: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.15076: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.16058: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a1 46400 1727204638.16062: WORKER PROCESS EXITING 46400 1727204638.17023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.17951: done with get_vars() 46400 1727204638.17974: done getting variables 46400 1727204638.18019: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.102) 0:02:08.465 ***** 46400 1727204638.18045: entering _queue_task() for managed-node2/service 46400 1727204638.18293: worker is 1 (out of 1 available) 46400 1727204638.18307: exiting _queue_task() for managed-node2/service 46400 1727204638.18320: done queuing things up, now waiting for results queue to drain 46400 1727204638.18322: waiting for pending results... 
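The wpa_supplicant task above was skipped because __network_wpa_supplicant_required evaluated to false. Judging from the role-default variables the trace loads while evaluating it (__network_ieee802_1x_connections_defined and __network_wireless_connections_defined), that flag is presumably derived from whether any 802.1x or wireless connections are defined. The following is a reconstruction of the likely shape of the default and of the skipped task at tasks/main.yml:133, assumed from the conditionals shown in the log rather than copied from the role source.

# assumed shape of the role default
__network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined or __network_wireless_connections_defined }}"

# assumed shape of the skipped task
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool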
46400 1727204638.18587: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 46400 1727204638.18770: in run() - task 0affcd87-79f5-1303-fda8-0000000026a2 46400 1727204638.18793: variable 'ansible_search_path' from source: unknown 46400 1727204638.18800: variable 'ansible_search_path' from source: unknown 46400 1727204638.18841: calling self._execute() 46400 1727204638.18954: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.18976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.18992: variable 'omit' from source: magic vars 46400 1727204638.19394: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.19417: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.19551: variable 'network_provider' from source: set_fact 46400 1727204638.19567: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204638.19575: when evaluation is False, skipping this task 46400 1727204638.19582: _execute() done 46400 1727204638.19590: dumping result to json 46400 1727204638.19598: done dumping result, returning 46400 1727204638.19610: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1303-fda8-0000000026a2] 46400 1727204638.19631: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a2 46400 1727204638.19752: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a2 46400 1727204638.19761: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 46400 1727204638.19810: no more pending results, returning what we have 46400 1727204638.19815: results queue empty 46400 1727204638.19816: checking for any_errors_fatal 46400 1727204638.19826: done checking for any_errors_fatal 46400 1727204638.19826: checking for max_fail_percentage 46400 1727204638.19828: done checking for max_fail_percentage 46400 1727204638.19829: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.19830: done checking to see if all hosts have failed 46400 1727204638.19831: getting the remaining hosts for this loop 46400 1727204638.19833: done getting the remaining hosts for this loop 46400 1727204638.19836: getting the next task for host managed-node2 46400 1727204638.19845: done getting next task for host managed-node2 46400 1727204638.19849: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204638.19855: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204638.19893: getting variables 46400 1727204638.19895: in VariableManager get_vars() 46400 1727204638.19947: Calling all_inventory to load vars for managed-node2 46400 1727204638.19950: Calling groups_inventory to load vars for managed-node2 46400 1727204638.19952: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.19968: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.19971: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.19974: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.20898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.21954: done with get_vars() 46400 1727204638.21973: done getting variables 46400 1727204638.22015: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.039) 0:02:08.505 ***** 46400 1727204638.22044: entering _queue_task() for managed-node2/copy 46400 1727204638.22313: worker is 1 (out of 1 available) 46400 1727204638.22326: exiting _queue_task() for managed-node2/copy 46400 1727204638.22338: done queuing things up, now waiting for results queue to drain 46400 1727204638.22340: waiting for pending results... 
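The "Enable network service" task above is likewise skipped because network_provider is "nm" on this host, so the condition network_provider == "initscripts" evaluates to false. A sketch of the gating pattern that condition implies is shown below; the task body itself is not in this log, and the legacy "network" service name is an assumption.

# assumed sketch of an initscripts-only task
- name: Enable network service
  ansible.builtin.service:
    name: network          # assumed: the legacy initscripts network service
    state: started
    enabled: true
  when: network_provider == "initscripts"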
46400 1727204638.22635: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 46400 1727204638.22799: in run() - task 0affcd87-79f5-1303-fda8-0000000026a3 46400 1727204638.22817: variable 'ansible_search_path' from source: unknown 46400 1727204638.22824: variable 'ansible_search_path' from source: unknown 46400 1727204638.22863: calling self._execute() 46400 1727204638.22969: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.22981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.23000: variable 'omit' from source: magic vars 46400 1727204638.23387: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.23405: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.23530: variable 'network_provider' from source: set_fact 46400 1727204638.23546: Evaluated conditional (network_provider == "initscripts"): False 46400 1727204638.23554: when evaluation is False, skipping this task 46400 1727204638.23561: _execute() done 46400 1727204638.23571: dumping result to json 46400 1727204638.23579: done dumping result, returning 46400 1727204638.23589: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1303-fda8-0000000026a3] 46400 1727204638.23600: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a3 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 46400 1727204638.23754: no more pending results, returning what we have 46400 1727204638.23758: results queue empty 46400 1727204638.23760: checking for any_errors_fatal 46400 1727204638.23768: done checking for any_errors_fatal 46400 1727204638.23769: checking for max_fail_percentage 46400 1727204638.23771: done checking for max_fail_percentage 46400 1727204638.23772: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.23773: done checking to see if all hosts have failed 46400 1727204638.23774: getting the remaining hosts for this loop 46400 1727204638.23777: done getting the remaining hosts for this loop 46400 1727204638.23781: getting the next task for host managed-node2 46400 1727204638.23791: done getting next task for host managed-node2 46400 1727204638.23796: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204638.23802: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204638.23838: getting variables 46400 1727204638.23840: in VariableManager get_vars() 46400 1727204638.23894: Calling all_inventory to load vars for managed-node2 46400 1727204638.23898: Calling groups_inventory to load vars for managed-node2 46400 1727204638.23901: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.23914: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.23917: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.23921: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.24883: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a3 46400 1727204638.24887: WORKER PROCESS EXITING 46400 1727204638.25669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.27341: done with get_vars() 46400 1727204638.27372: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.054) 0:02:08.559 ***** 46400 1727204638.27465: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204638.27808: worker is 1 (out of 1 available) 46400 1727204638.27820: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 46400 1727204638.27833: done queuing things up, now waiting for results queue to drain 46400 1727204638.27835: waiting for pending results... 
46400 1727204638.28140: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 46400 1727204638.28307: in run() - task 0affcd87-79f5-1303-fda8-0000000026a4 46400 1727204638.28330: variable 'ansible_search_path' from source: unknown 46400 1727204638.28338: variable 'ansible_search_path' from source: unknown 46400 1727204638.28384: calling self._execute() 46400 1727204638.28491: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.28508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.28521: variable 'omit' from source: magic vars 46400 1727204638.28921: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.28944: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.28957: variable 'omit' from source: magic vars 46400 1727204638.29032: variable 'omit' from source: magic vars 46400 1727204638.29206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204638.31521: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204638.31602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204638.31651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204638.31695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204638.31727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204638.31817: variable 'network_provider' from source: set_fact 46400 1727204638.31958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204638.31997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204638.32028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204638.32079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204638.32101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204638.32178: variable 'omit' from source: magic vars 46400 1727204638.32294: variable 'omit' from source: magic vars 46400 1727204638.32404: variable 'network_connections' from source: include params 46400 1727204638.32423: variable 'interface' from source: play vars 46400 1727204638.32488: variable 'interface' from source: play vars 46400 1727204638.32644: variable 'omit' from source: magic vars 46400 1727204638.32657: variable '__lsr_ansible_managed' from source: task vars 46400 1727204638.32720: variable '__lsr_ansible_managed' from source: 
task vars 46400 1727204638.33301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 46400 1727204638.33515: Loaded config def from plugin (lookup/template) 46400 1727204638.33526: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 46400 1727204638.33558: File lookup term: get_ansible_managed.j2 46400 1727204638.33569: variable 'ansible_search_path' from source: unknown 46400 1727204638.33580: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 46400 1727204638.33601: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 46400 1727204638.33625: variable 'ansible_search_path' from source: unknown 46400 1727204638.39881: variable 'ansible_managed' from source: unknown 46400 1727204638.40039: variable 'omit' from source: magic vars 46400 1727204638.40077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204638.40111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204638.40135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204638.40158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.40176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.40206: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204638.40219: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.40228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.40319: Set connection var ansible_shell_type to sh 46400 1727204638.40337: Set connection var ansible_shell_executable to /bin/sh 46400 1727204638.40349: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204638.40359: Set connection var ansible_connection to ssh 46400 1727204638.40371: Set connection var ansible_pipelining to False 46400 1727204638.40382: Set connection var ansible_timeout to 10 46400 1727204638.40414: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.40422: variable 'ansible_connection' from source: unknown 46400 1727204638.40430: variable 'ansible_module_compression' 
from source: unknown 46400 1727204638.40440: variable 'ansible_shell_type' from source: unknown 46400 1727204638.40448: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.40455: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.40462: variable 'ansible_pipelining' from source: unknown 46400 1727204638.40472: variable 'ansible_timeout' from source: unknown 46400 1727204638.40480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.40617: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204638.40642: variable 'omit' from source: magic vars 46400 1727204638.40656: starting attempt loop 46400 1727204638.40663: running the handler 46400 1727204638.40683: _low_level_execute_command(): starting 46400 1727204638.40693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204638.41437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.41453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.41474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.41494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.41538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.41554: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.41570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.41589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204638.41601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.41612: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.41624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.41639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.41659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.41675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.41687: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204638.41700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.41781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.41804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.41818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.41904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.43575: stdout chunk (state=3): >>>/root <<< 46400 1727204638.43772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.43775: stdout chunk (state=3): >>><<< 46400 1727204638.43778: stderr 
chunk (state=3): >>><<< 46400 1727204638.43893: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204638.43896: _low_level_execute_command(): starting 46400 1727204638.43899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231 `" && echo ansible-tmp-1727204638.438039-55448-264642025951231="` echo /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231 `" ) && sleep 0' 46400 1727204638.44524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.44538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.44555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.44587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.44630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.44643: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.44661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.44688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204638.44700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.44711: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.44723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.44736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.44753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.44769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.44788: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204638.44803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.44881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 46400 1727204638.44912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.44929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.45006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.46895: stdout chunk (state=3): >>>ansible-tmp-1727204638.438039-55448-264642025951231=/root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231 <<< 46400 1727204638.47014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.47104: stderr chunk (state=3): >>><<< 46400 1727204638.47119: stdout chunk (state=3): >>><<< 46400 1727204638.47375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204638.438039-55448-264642025951231=/root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204638.47384: variable 'ansible_module_compression' from source: unknown 46400 1727204638.47387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 46400 1727204638.47389: variable 'ansible_facts' from source: unknown 46400 1727204638.47449: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/AnsiballZ_network_connections.py 46400 1727204638.47621: Sending initial data 46400 1727204638.47624: Sent initial data (167 bytes) 46400 1727204638.48657: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.48676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.48698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.48716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.48760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.48774: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.48789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.48816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
46400 1727204638.48829: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.48843: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.48856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.48873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.48890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.48910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.48926: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204638.48941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.49026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.49052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.49072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.49154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.50867: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204638.50911: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204638.50946: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp2vs14rdy /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/AnsiballZ_network_connections.py <<< 46400 1727204638.50979: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204638.52435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.52675: stderr chunk (state=3): >>><<< 46400 1727204638.52678: stdout chunk (state=3): >>><<< 46400 1727204638.52681: done transferring module to remote 46400 1727204638.52683: _low_level_execute_command(): starting 46400 1727204638.52685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/ /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/AnsiballZ_network_connections.py && sleep 0' 46400 1727204638.53301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.53314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.53329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.53347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.53389: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.53405: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.53424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.53443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204638.53455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.53471: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.53486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.53507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.53527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.53582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.53585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.53599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.53649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.55389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.55454: stderr chunk (state=3): >>><<< 46400 1727204638.55460: stdout chunk (state=3): >>><<< 46400 1727204638.55499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204638.55502: _low_level_execute_command(): starting 46400 1727204638.55509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/AnsiballZ_network_connections.py && sleep 0' 46400 1727204638.56195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204638.56208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.56223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204638.56251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.56299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.56311: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204638.56323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.56344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204638.56363: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204638.56378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204638.56390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.56408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.56422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.56433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204638.56444: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204638.56463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.56548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.56579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204638.56603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.56691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.79739: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 46400 1727204638.81178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204638.81237: stderr chunk (state=3): >>><<< 46400 1727204638.81243: stdout chunk (state=3): >>><<< 46400 1727204638.81258: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204638.81289: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204638.81297: _low_level_execute_command(): starting 46400 1727204638.81302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204638.438039-55448-264642025951231/ > /dev/null 2>&1 && sleep 0' 46400 1727204638.81781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204638.81784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204638.81815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.81818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204638.81821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204638.81878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204638.81881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204638.81926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204638.83720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204638.83773: stderr chunk (state=3): >>><<< 46400 1727204638.83777: stdout chunk (state=3): >>><<< 46400 1727204638.83791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204638.83797: handler run complete 46400 1727204638.83820: attempt loop complete, returning result 46400 1727204638.83823: _execute() done 46400 1727204638.83826: dumping result to json 46400 1727204638.83831: done dumping result, returning 46400 1727204638.83840: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1303-fda8-0000000026a4] 46400 1727204638.83845: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a4 46400 1727204638.83950: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a4 46400 1727204638.83953: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 46400 1727204638.84057: no more pending results, returning what we have 46400 1727204638.84063: results queue empty 46400 1727204638.84064: checking for any_errors_fatal 46400 1727204638.84070: done checking for any_errors_fatal 46400 1727204638.84071: checking for max_fail_percentage 46400 1727204638.84074: done checking for max_fail_percentage 46400 1727204638.84075: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.84076: done checking to see if all hosts have failed 46400 1727204638.84077: getting the remaining hosts for this loop 46400 1727204638.84079: done getting the remaining hosts for this loop 46400 1727204638.84082: getting the next task for host managed-node2 46400 1727204638.84089: done getting next task for host managed-node2 46400 1727204638.84094: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204638.84098: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204638.84112: getting variables 46400 1727204638.84113: in VariableManager get_vars() 46400 1727204638.84157: Calling all_inventory to load vars for managed-node2 46400 1727204638.84160: Calling groups_inventory to load vars for managed-node2 46400 1727204638.84162: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.84180: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.84182: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.84185: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.85205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.86117: done with get_vars() 46400 1727204638.86136: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.587) 0:02:09.146 ***** 46400 1727204638.86203: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204638.86450: worker is 1 (out of 1 available) 46400 1727204638.86461: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 46400 1727204638.86477: done queuing things up, now waiting for results queue to drain 46400 1727204638.86479: waiting for pending results... 46400 1727204638.86678: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 46400 1727204638.86790: in run() - task 0affcd87-79f5-1303-fda8-0000000026a5 46400 1727204638.86802: variable 'ansible_search_path' from source: unknown 46400 1727204638.86807: variable 'ansible_search_path' from source: unknown 46400 1727204638.86835: calling self._execute() 46400 1727204638.86921: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.86924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.86933: variable 'omit' from source: magic vars 46400 1727204638.87215: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.87224: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.87315: variable 'network_state' from source: role '' defaults 46400 1727204638.87324: Evaluated conditional (network_state != {}): False 46400 1727204638.87327: when evaluation is False, skipping this task 46400 1727204638.87330: _execute() done 46400 1727204638.87332: dumping result to json 46400 1727204638.87336: done dumping result, returning 46400 1727204638.87342: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1303-fda8-0000000026a5] 46400 1727204638.87352: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a5 46400 1727204638.87444: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a5 46400 1727204638.87448: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 46400 1727204638.87524: no more pending 
results, returning what we have 46400 1727204638.87528: results queue empty 46400 1727204638.87529: checking for any_errors_fatal 46400 1727204638.87539: done checking for any_errors_fatal 46400 1727204638.87540: checking for max_fail_percentage 46400 1727204638.87541: done checking for max_fail_percentage 46400 1727204638.87542: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.87543: done checking to see if all hosts have failed 46400 1727204638.87544: getting the remaining hosts for this loop 46400 1727204638.87545: done getting the remaining hosts for this loop 46400 1727204638.87549: getting the next task for host managed-node2 46400 1727204638.87563: done getting next task for host managed-node2 46400 1727204638.87569: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204638.87574: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204638.87598: getting variables 46400 1727204638.87600: in VariableManager get_vars() 46400 1727204638.87637: Calling all_inventory to load vars for managed-node2 46400 1727204638.87641: Calling groups_inventory to load vars for managed-node2 46400 1727204638.87643: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.87652: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.87654: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.87657: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.88469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.89405: done with get_vars() 46400 1727204638.89423: done getting variables 46400 1727204638.89469: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.032) 0:02:09.179 ***** 46400 1727204638.89495: entering _queue_task() for managed-node2/debug 46400 1727204638.89739: worker is 1 (out of 1 available) 46400 1727204638.89753: exiting _queue_task() for managed-node2/debug 46400 1727204638.89768: done queuing things up, now waiting for results queue to drain 46400 1727204638.89770: waiting for pending results... 46400 1727204638.89965: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 46400 1727204638.90084: in run() - task 0affcd87-79f5-1303-fda8-0000000026a6 46400 1727204638.90098: variable 'ansible_search_path' from source: unknown 46400 1727204638.90102: variable 'ansible_search_path' from source: unknown 46400 1727204638.90132: calling self._execute() 46400 1727204638.90217: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.90227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.90235: variable 'omit' from source: magic vars 46400 1727204638.90520: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.90530: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.90536: variable 'omit' from source: magic vars 46400 1727204638.90584: variable 'omit' from source: magic vars 46400 1727204638.90611: variable 'omit' from source: magic vars 46400 1727204638.90647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204638.90676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204638.90693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204638.90709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.90720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.90742: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204638.90746: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.90748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.90817: Set connection var ansible_shell_type to sh 46400 1727204638.90826: Set connection var ansible_shell_executable to /bin/sh 46400 1727204638.90831: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204638.90836: Set connection var ansible_connection to ssh 46400 1727204638.90842: Set connection var ansible_pipelining to False 46400 1727204638.90847: Set connection var ansible_timeout to 10 46400 1727204638.90869: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.90874: variable 'ansible_connection' from source: unknown 46400 1727204638.90878: variable 'ansible_module_compression' from source: unknown 46400 1727204638.90880: variable 'ansible_shell_type' from source: unknown 46400 1727204638.90882: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.90884: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.90889: variable 'ansible_pipelining' from source: unknown 46400 1727204638.90891: variable 'ansible_timeout' from source: unknown 46400 1727204638.90895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.91005: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204638.91015: variable 'omit' from source: magic vars 46400 1727204638.91019: starting attempt loop 46400 1727204638.91023: running the handler 46400 1727204638.91116: variable '__network_connections_result' from source: set_fact 46400 1727204638.91158: handler run complete 46400 1727204638.91176: attempt loop complete, returning result 46400 1727204638.91179: _execute() done 46400 1727204638.91182: dumping result to json 46400 1727204638.91185: done dumping result, returning 46400 1727204638.91192: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1303-fda8-0000000026a6] 46400 1727204638.91197: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a6 46400 1727204638.91287: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a6 46400 1727204638.91290: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 46400 1727204638.91381: no more pending results, returning what we have 46400 1727204638.91385: results queue empty 46400 1727204638.91386: checking for any_errors_fatal 46400 1727204638.91391: done checking for any_errors_fatal 46400 1727204638.91392: checking for max_fail_percentage 46400 1727204638.91393: done checking for max_fail_percentage 46400 1727204638.91394: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.91395: done checking to see if all hosts have failed 46400 1727204638.91396: getting the remaining hosts for this loop 46400 1727204638.91397: done getting the remaining hosts for this loop 46400 1727204638.91400: getting the next task 
for host managed-node2 46400 1727204638.91407: done getting next task for host managed-node2 46400 1727204638.91412: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204638.91416: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204638.91428: getting variables 46400 1727204638.91430: in VariableManager get_vars() 46400 1727204638.91476: Calling all_inventory to load vars for managed-node2 46400 1727204638.91479: Calling groups_inventory to load vars for managed-node2 46400 1727204638.91481: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.91490: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.91492: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.91495: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.92439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.93355: done with get_vars() 46400 1727204638.93374: done getting variables 46400 1727204638.93419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.039) 0:02:09.219 ***** 46400 1727204638.93449: entering _queue_task() for managed-node2/debug 46400 1727204638.93684: worker is 1 (out of 1 available) 46400 1727204638.93699: exiting _queue_task() for managed-node2/debug 46400 1727204638.93713: done queuing things up, now waiting for results queue to drain 46400 1727204638.93714: waiting for pending results... 
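For reference, the ok result shown above for "Configure networking connection profiles" (provider nm, one connection named statebr with persistent_state absent and state down) is the module invocation that the fedora.linux_system_roles.network role builds from its input variables. A minimal sketch of play variables that would drive the same invocation, assuming the role's documented network_provider and network_connections variables (the playbook actually used in this run is not reproduced here):

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_provider: nm        # assumed role variable; matches the "provider": "nm" logged above
        network_connections:        # values taken from the module_args in the task result above
          - name: statebr
            persistent_state: absent
            state: down
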
46400 1727204638.93909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 46400 1727204638.94015: in run() - task 0affcd87-79f5-1303-fda8-0000000026a7 46400 1727204638.94028: variable 'ansible_search_path' from source: unknown 46400 1727204638.94032: variable 'ansible_search_path' from source: unknown 46400 1727204638.94067: calling self._execute() 46400 1727204638.94152: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.94159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.94170: variable 'omit' from source: magic vars 46400 1727204638.94457: variable 'ansible_distribution_major_version' from source: facts 46400 1727204638.94471: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204638.94476: variable 'omit' from source: magic vars 46400 1727204638.94523: variable 'omit' from source: magic vars 46400 1727204638.94545: variable 'omit' from source: magic vars 46400 1727204638.94584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204638.94615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204638.94634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204638.94647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.94657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204638.94685: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204638.94688: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.94691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.94758: Set connection var ansible_shell_type to sh 46400 1727204638.94770: Set connection var ansible_shell_executable to /bin/sh 46400 1727204638.94775: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204638.94780: Set connection var ansible_connection to ssh 46400 1727204638.94785: Set connection var ansible_pipelining to False 46400 1727204638.94790: Set connection var ansible_timeout to 10 46400 1727204638.94810: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.94813: variable 'ansible_connection' from source: unknown 46400 1727204638.94816: variable 'ansible_module_compression' from source: unknown 46400 1727204638.94819: variable 'ansible_shell_type' from source: unknown 46400 1727204638.94821: variable 'ansible_shell_executable' from source: unknown 46400 1727204638.94824: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.94826: variable 'ansible_pipelining' from source: unknown 46400 1727204638.94828: variable 'ansible_timeout' from source: unknown 46400 1727204638.94830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.94933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 
1727204638.94943: variable 'omit' from source: magic vars 46400 1727204638.94946: starting attempt loop 46400 1727204638.94949: running the handler 46400 1727204638.94991: variable '__network_connections_result' from source: set_fact 46400 1727204638.95045: variable '__network_connections_result' from source: set_fact 46400 1727204638.95146: handler run complete 46400 1727204638.95182: attempt loop complete, returning result 46400 1727204638.95189: _execute() done 46400 1727204638.95196: dumping result to json 46400 1727204638.95204: done dumping result, returning 46400 1727204638.95214: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1303-fda8-0000000026a7] 46400 1727204638.95223: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a7 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 46400 1727204638.95439: no more pending results, returning what we have 46400 1727204638.95443: results queue empty 46400 1727204638.95445: checking for any_errors_fatal 46400 1727204638.95454: done checking for any_errors_fatal 46400 1727204638.95455: checking for max_fail_percentage 46400 1727204638.95457: done checking for max_fail_percentage 46400 1727204638.95458: checking to see if all hosts have failed and the running result is not ok 46400 1727204638.95459: done checking to see if all hosts have failed 46400 1727204638.95462: getting the remaining hosts for this loop 46400 1727204638.95466: done getting the remaining hosts for this loop 46400 1727204638.95470: getting the next task for host managed-node2 46400 1727204638.95481: done getting next task for host managed-node2 46400 1727204638.95485: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204638.95491: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204638.95508: getting variables 46400 1727204638.95510: in VariableManager get_vars() 46400 1727204638.95562: Calling all_inventory to load vars for managed-node2 46400 1727204638.95567: Calling groups_inventory to load vars for managed-node2 46400 1727204638.95570: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204638.95581: Calling all_plugins_play to load vars for managed-node2 46400 1727204638.95590: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204638.95594: Calling groups_plugins_play to load vars for managed-node2 46400 1727204638.96483: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a7 46400 1727204638.96487: WORKER PROCESS EXITING 46400 1727204638.97417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204638.98906: done with get_vars() 46400 1727204638.98924: done getting variables 46400 1727204638.98975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.055) 0:02:09.274 ***** 46400 1727204638.99003: entering _queue_task() for managed-node2/debug 46400 1727204638.99248: worker is 1 (out of 1 available) 46400 1727204638.99268: exiting _queue_task() for managed-node2/debug 46400 1727204638.99281: done queuing things up, now waiting for results queue to drain 46400 1727204638.99283: waiting for pending results... 
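Both network_state tasks in this run ("Configure networking state" above and "Show debug messages for the network_state" below) are skipped because network_state comes from the role defaults as an empty dict, so the conditional network_state != {} evaluates to False. For contrast, a minimal sketch of a non-empty network_state that would cause those tasks to run, assuming the nmstate-style schema the role hands to its network_state module (illustrative only, not part of this run):

    network_state:
      interfaces:
        - name: statebr
          type: linux-bridge
          state: up
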
46400 1727204638.99476: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 46400 1727204638.99575: in run() - task 0affcd87-79f5-1303-fda8-0000000026a8 46400 1727204638.99589: variable 'ansible_search_path' from source: unknown 46400 1727204638.99592: variable 'ansible_search_path' from source: unknown 46400 1727204638.99622: calling self._execute() 46400 1727204638.99713: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204638.99717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204638.99727: variable 'omit' from source: magic vars 46400 1727204639.00008: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.00017: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.00108: variable 'network_state' from source: role '' defaults 46400 1727204639.00116: Evaluated conditional (network_state != {}): False 46400 1727204639.00119: when evaluation is False, skipping this task 46400 1727204639.00122: _execute() done 46400 1727204639.00124: dumping result to json 46400 1727204639.00126: done dumping result, returning 46400 1727204639.00133: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1303-fda8-0000000026a8] 46400 1727204639.00144: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a8 46400 1727204639.00229: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a8 46400 1727204639.00233: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 46400 1727204639.00304: no more pending results, returning what we have 46400 1727204639.00308: results queue empty 46400 1727204639.00309: checking for any_errors_fatal 46400 1727204639.00320: done checking for any_errors_fatal 46400 1727204639.00321: checking for max_fail_percentage 46400 1727204639.00324: done checking for max_fail_percentage 46400 1727204639.00325: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.00326: done checking to see if all hosts have failed 46400 1727204639.00326: getting the remaining hosts for this loop 46400 1727204639.00328: done getting the remaining hosts for this loop 46400 1727204639.00332: getting the next task for host managed-node2 46400 1727204639.00340: done getting next task for host managed-node2 46400 1727204639.00344: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204639.00348: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204639.00380: getting variables 46400 1727204639.00382: in VariableManager get_vars() 46400 1727204639.00420: Calling all_inventory to load vars for managed-node2 46400 1727204639.00423: Calling groups_inventory to load vars for managed-node2 46400 1727204639.00425: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.00434: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.00437: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.00440: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.01218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.02126: done with get_vars() 46400 1727204639.02142: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.032) 0:02:09.306 ***** 46400 1727204639.02215: entering _queue_task() for managed-node2/ping 46400 1727204639.02427: worker is 1 (out of 1 available) 46400 1727204639.02441: exiting _queue_task() for managed-node2/ping 46400 1727204639.02455: done queuing things up, now waiting for results queue to drain 46400 1727204639.02456: waiting for pending results... 46400 1727204639.02642: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 46400 1727204639.02743: in run() - task 0affcd87-79f5-1303-fda8-0000000026a9 46400 1727204639.02756: variable 'ansible_search_path' from source: unknown 46400 1727204639.02760: variable 'ansible_search_path' from source: unknown 46400 1727204639.02792: calling self._execute() 46400 1727204639.02871: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.02883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.02907: variable 'omit' from source: magic vars 46400 1727204639.03269: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.03286: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.03296: variable 'omit' from source: magic vars 46400 1727204639.03373: variable 'omit' from source: magic vars 46400 1727204639.03413: variable 'omit' from source: magic vars 46400 1727204639.03462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204639.03505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204639.03537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204639.03562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.03583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.03617: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204639.03626: variable 'ansible_host' from source: host vars for 
'managed-node2' 46400 1727204639.03641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.03738: Set connection var ansible_shell_type to sh 46400 1727204639.03758: Set connection var ansible_shell_executable to /bin/sh 46400 1727204639.03770: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204639.03781: Set connection var ansible_connection to ssh 46400 1727204639.03791: Set connection var ansible_pipelining to False 46400 1727204639.03800: Set connection var ansible_timeout to 10 46400 1727204639.03827: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.03834: variable 'ansible_connection' from source: unknown 46400 1727204639.03840: variable 'ansible_module_compression' from source: unknown 46400 1727204639.03846: variable 'ansible_shell_type' from source: unknown 46400 1727204639.03856: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.03865: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.03874: variable 'ansible_pipelining' from source: unknown 46400 1727204639.03880: variable 'ansible_timeout' from source: unknown 46400 1727204639.03887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.04099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204639.04116: variable 'omit' from source: magic vars 46400 1727204639.04126: starting attempt loop 46400 1727204639.04134: running the handler 46400 1727204639.04152: _low_level_execute_command(): starting 46400 1727204639.04166: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204639.04823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.04826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.04867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.04871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.04876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.04924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.04928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.04982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.06680: stdout chunk (state=3): >>>/root <<< 46400 1727204639.06791: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 46400 1727204639.06837: stderr chunk (state=3): >>><<< 46400 1727204639.06841: stdout chunk (state=3): >>><<< 46400 1727204639.06861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.06879: _low_level_execute_command(): starting 46400 1727204639.06884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857 `" && echo ansible-tmp-1727204639.0686612-55471-260428175927857="` echo /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857 `" ) && sleep 0' 46400 1727204639.07310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.07326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.07338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204639.07367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.07400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.07443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.07457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.07512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.09476: stdout chunk (state=3): >>>ansible-tmp-1727204639.0686612-55471-260428175927857=/root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857 <<< 46400 1727204639.09593: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 46400 1727204639.09680: stderr chunk (state=3): >>><<< 46400 1727204639.09698: stdout chunk (state=3): >>><<< 46400 1727204639.09722: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.0686612-55471-260428175927857=/root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.09777: variable 'ansible_module_compression' from source: unknown 46400 1727204639.09827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 46400 1727204639.09868: variable 'ansible_facts' from source: unknown 46400 1727204639.09962: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/AnsiballZ_ping.py 46400 1727204639.10125: Sending initial data 46400 1727204639.10128: Sent initial data (153 bytes) 46400 1727204639.11151: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204639.11172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.11188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.11209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.11263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.11280: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204639.11296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.11314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204639.11329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204639.11345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204639.11359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.11380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.11397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.11410: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.11423: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204639.11451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.11529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.11553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.11580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.11664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.13533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204639.13571: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204639.13616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp4k5nfwbg /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/AnsiballZ_ping.py <<< 46400 1727204639.13661: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204639.14792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.14974: stderr chunk (state=3): >>><<< 46400 1727204639.14977: stdout chunk (state=3): >>><<< 46400 1727204639.14980: done transferring module to remote 46400 1727204639.14982: _low_level_execute_command(): starting 46400 1727204639.14984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/ /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/AnsiballZ_ping.py && sleep 0' 46400 1727204639.15635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204639.15652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.15676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.15695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.15739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.15756: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204639.15777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.15796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204639.15809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204639.15821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204639.15834: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.15851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.15882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.15897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.15909: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204639.15924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.16012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.16034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.16052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.16127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.18022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.18108: stderr chunk (state=3): >>><<< 46400 1727204639.18112: stdout chunk (state=3): >>><<< 46400 1727204639.18173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.18176: _low_level_execute_command(): starting 46400 1727204639.18179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/AnsiballZ_ping.py && sleep 0' 46400 1727204639.18698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.18701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.18738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.18743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.18745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.18795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.18798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.18863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.32494: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 46400 1727204639.33526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204639.33586: stderr chunk (state=3): >>><<< 46400 1727204639.33590: stdout chunk (state=3): >>><<< 46400 1727204639.33607: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
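For reference, the exchange traced above is the execution of a plain ping module call by the fedora.linux_system_roles.network role's "Re-test connectivity" task. A minimal sketch of an equivalent task, assuming only what the invocation output shows (module name ping, module_args {"data": "pong"}), would be:

    # Sketch only: task name and module taken from the trace above, FQCN assumed.
    - name: Re-test connectivity
      ansible.builtin.ping:
        data: pong   # matches the module_args reported in the result

The module returns {"ping": "pong"} whenever the managed node's Python interpreter can unpack and run the transferred AnsiballZ payload, which is what the "ok" result below reflects.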
46400 1727204639.33629: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204639.33640: _low_level_execute_command(): starting 46400 1727204639.33643: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.0686612-55471-260428175927857/ > /dev/null 2>&1 && sleep 0' 46400 1727204639.34122: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.34126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.34159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.34167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.34170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.34223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.34227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.34238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.34276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.36068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.36124: stderr chunk (state=3): >>><<< 46400 1727204639.36128: stdout chunk (state=3): >>><<< 46400 1727204639.36143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.36148: handler run complete 46400 1727204639.36164: attempt loop complete, returning result 46400 1727204639.36169: _execute() done 46400 1727204639.36176: dumping result to json 46400 1727204639.36179: done dumping result, returning 46400 1727204639.36188: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1303-fda8-0000000026a9] 46400 1727204639.36193: sending task result for task 0affcd87-79f5-1303-fda8-0000000026a9 46400 1727204639.36288: done sending task result for task 0affcd87-79f5-1303-fda8-0000000026a9 46400 1727204639.36290: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 46400 1727204639.36400: no more pending results, returning what we have 46400 1727204639.36404: results queue empty 46400 1727204639.36405: checking for any_errors_fatal 46400 1727204639.36412: done checking for any_errors_fatal 46400 1727204639.36413: checking for max_fail_percentage 46400 1727204639.36415: done checking for max_fail_percentage 46400 1727204639.36415: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.36416: done checking to see if all hosts have failed 46400 1727204639.36417: getting the remaining hosts for this loop 46400 1727204639.36419: done getting the remaining hosts for this loop 46400 1727204639.36424: getting the next task for host managed-node2 46400 1727204639.36435: done getting next task for host managed-node2 46400 1727204639.36437: ^ task is: TASK: meta (role_complete) 46400 1727204639.36443: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204639.36456: getting variables 46400 1727204639.36458: in VariableManager get_vars() 46400 1727204639.36507: Calling all_inventory to load vars for managed-node2 46400 1727204639.36510: Calling groups_inventory to load vars for managed-node2 46400 1727204639.36513: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.36522: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.36525: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.36527: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.37513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.38428: done with get_vars() 46400 1727204639.38454: done getting variables 46400 1727204639.38524: done queuing things up, now waiting for results queue to drain 46400 1727204639.38526: results queue empty 46400 1727204639.38526: checking for any_errors_fatal 46400 1727204639.38528: done checking for any_errors_fatal 46400 1727204639.38529: checking for max_fail_percentage 46400 1727204639.38530: done checking for max_fail_percentage 46400 1727204639.38530: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.38531: done checking to see if all hosts have failed 46400 1727204639.38531: getting the remaining hosts for this loop 46400 1727204639.38532: done getting the remaining hosts for this loop 46400 1727204639.38534: getting the next task for host managed-node2 46400 1727204639.38538: done getting next task for host managed-node2 46400 1727204639.38540: ^ task is: TASK: Asserts 46400 1727204639.38541: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204639.38544: getting variables 46400 1727204639.38545: in VariableManager get_vars() 46400 1727204639.38555: Calling all_inventory to load vars for managed-node2 46400 1727204639.38557: Calling groups_inventory to load vars for managed-node2 46400 1727204639.38558: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.38562: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.38565: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.38567: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.39242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.40148: done with get_vars() 46400 1727204639.40172: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.380) 0:02:09.686 ***** 46400 1727204639.40230: entering _queue_task() for managed-node2/include_tasks 46400 1727204639.40493: worker is 1 (out of 1 available) 46400 1727204639.40506: exiting _queue_task() for managed-node2/include_tasks 46400 1727204639.40519: done queuing things up, now waiting for results queue to drain 46400 1727204639.40521: waiting for pending results... 46400 1727204639.40724: running TaskExecutor() for managed-node2/TASK: Asserts 46400 1727204639.40817: in run() - task 0affcd87-79f5-1303-fda8-0000000020b2 46400 1727204639.40827: variable 'ansible_search_path' from source: unknown 46400 1727204639.40830: variable 'ansible_search_path' from source: unknown 46400 1727204639.40872: variable 'lsr_assert' from source: include params 46400 1727204639.41036: variable 'lsr_assert' from source: include params 46400 1727204639.41098: variable 'omit' from source: magic vars 46400 1727204639.41213: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.41221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.41229: variable 'omit' from source: magic vars 46400 1727204639.41416: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.41424: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.41430: variable 'item' from source: unknown 46400 1727204639.41480: variable 'item' from source: unknown 46400 1727204639.41505: variable 'item' from source: unknown 46400 1727204639.41549: variable 'item' from source: unknown 46400 1727204639.41690: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.41693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.41695: variable 'omit' from source: magic vars 46400 1727204639.41777: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.41780: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.41786: variable 'item' from source: unknown 46400 1727204639.41832: variable 'item' from source: unknown 46400 1727204639.41855: variable 'item' from source: unknown 46400 1727204639.41901: variable 'item' from source: unknown 46400 1727204639.41973: dumping result to json 46400 1727204639.41976: done dumping result, returning 46400 1727204639.41978: done running TaskExecutor() for managed-node2/TASK: Asserts 
[0affcd87-79f5-1303-fda8-0000000020b2] 46400 1727204639.41980: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b2 46400 1727204639.42012: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b2 46400 1727204639.42014: WORKER PROCESS EXITING 46400 1727204639.42044: no more pending results, returning what we have 46400 1727204639.42049: in VariableManager get_vars() 46400 1727204639.42103: Calling all_inventory to load vars for managed-node2 46400 1727204639.42106: Calling groups_inventory to load vars for managed-node2 46400 1727204639.42109: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.42123: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.42126: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.42128: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.47970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.48874: done with get_vars() 46400 1727204639.48896: variable 'ansible_search_path' from source: unknown 46400 1727204639.48897: variable 'ansible_search_path' from source: unknown 46400 1727204639.48928: variable 'ansible_search_path' from source: unknown 46400 1727204639.48929: variable 'ansible_search_path' from source: unknown 46400 1727204639.48946: we have included files to process 46400 1727204639.48946: generating all_blocks data 46400 1727204639.48948: done generating all_blocks data 46400 1727204639.48950: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204639.48951: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204639.48952: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 46400 1727204639.49027: in VariableManager get_vars() 46400 1727204639.49043: done with get_vars() 46400 1727204639.49118: done processing included file 46400 1727204639.49120: iterating over new_blocks loaded from include file 46400 1727204639.49121: in VariableManager get_vars() 46400 1727204639.49132: done with get_vars() 46400 1727204639.49133: filtering new block on tags 46400 1727204639.49154: done filtering new block on tags 46400 1727204639.49156: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 46400 1727204639.49159: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 46400 1727204639.49160: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 46400 1727204639.49163: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 46400 1727204639.49400: done processing included file 46400 1727204639.49401: iterating over new_blocks loaded from include file 46400 1727204639.49402: in VariableManager get_vars() 46400 1727204639.49414: done with get_vars() 46400 1727204639.49415: filtering new block on tags 46400 
1727204639.49444: done filtering new block on tags 46400 1727204639.49446: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed-node2 => (item=tasks/get_NetworkManager_NVR.yml) 46400 1727204639.49448: extending task lists for all hosts with included blocks 46400 1727204639.50108: done extending task lists 46400 1727204639.50110: done processing included files 46400 1727204639.50110: results queue empty 46400 1727204639.50111: checking for any_errors_fatal 46400 1727204639.50112: done checking for any_errors_fatal 46400 1727204639.50112: checking for max_fail_percentage 46400 1727204639.50113: done checking for max_fail_percentage 46400 1727204639.50114: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.50114: done checking to see if all hosts have failed 46400 1727204639.50115: getting the remaining hosts for this loop 46400 1727204639.50116: done getting the remaining hosts for this loop 46400 1727204639.50117: getting the next task for host managed-node2 46400 1727204639.50120: done getting next task for host managed-node2 46400 1727204639.50121: ^ task is: TASK: Include the task 'get_profile_stat.yml' 46400 1727204639.50123: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204639.50124: getting variables 46400 1727204639.50129: in VariableManager get_vars() 46400 1727204639.50137: Calling all_inventory to load vars for managed-node2 46400 1727204639.50139: Calling groups_inventory to load vars for managed-node2 46400 1727204639.50140: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.50144: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.50146: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.50147: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.50827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.51818: done with get_vars() 46400 1727204639.51834: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.116) 0:02:09.803 ***** 46400 1727204639.51890: entering _queue_task() for managed-node2/include_tasks 46400 1727204639.52147: worker is 1 (out of 1 available) 46400 1727204639.52161: exiting _queue_task() for managed-node2/include_tasks 46400 1727204639.52177: done queuing things up, now waiting for results queue to drain 46400 1727204639.52179: waiting for pending results... 46400 1727204639.52372: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 46400 1727204639.52452: in run() - task 0affcd87-79f5-1303-fda8-000000002804 46400 1727204639.52469: variable 'ansible_search_path' from source: unknown 46400 1727204639.52473: variable 'ansible_search_path' from source: unknown 46400 1727204639.52501: calling self._execute() 46400 1727204639.52586: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.52590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.52598: variable 'omit' from source: magic vars 46400 1727204639.52896: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.52907: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.52913: _execute() done 46400 1727204639.52917: dumping result to json 46400 1727204639.52920: done dumping result, returning 46400 1727204639.52924: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1303-fda8-000000002804] 46400 1727204639.52931: sending task result for task 0affcd87-79f5-1303-fda8-000000002804 46400 1727204639.53032: done sending task result for task 0affcd87-79f5-1303-fda8-000000002804 46400 1727204639.53035: WORKER PROCESS EXITING 46400 1727204639.53078: no more pending results, returning what we have 46400 1727204639.53083: in VariableManager get_vars() 46400 1727204639.53135: Calling all_inventory to load vars for managed-node2 46400 1727204639.53138: Calling groups_inventory to load vars for managed-node2 46400 1727204639.53141: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.53165: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.53168: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.53171: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.54011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 46400 1727204639.54949: done with get_vars() 46400 1727204639.54968: variable 'ansible_search_path' from source: unknown 46400 1727204639.54969: variable 'ansible_search_path' from source: unknown 46400 1727204639.54976: variable 'item' from source: include params 46400 1727204639.55066: variable 'item' from source: include params 46400 1727204639.55092: we have included files to process 46400 1727204639.55093: generating all_blocks data 46400 1727204639.55094: done generating all_blocks data 46400 1727204639.55095: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204639.55096: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204639.55097: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 46400 1727204639.55721: done processing included file 46400 1727204639.55723: iterating over new_blocks loaded from include file 46400 1727204639.55724: in VariableManager get_vars() 46400 1727204639.55738: done with get_vars() 46400 1727204639.55739: filtering new block on tags 46400 1727204639.55789: done filtering new block on tags 46400 1727204639.55791: in VariableManager get_vars() 46400 1727204639.55805: done with get_vars() 46400 1727204639.55806: filtering new block on tags 46400 1727204639.55840: done filtering new block on tags 46400 1727204639.55841: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 46400 1727204639.55845: extending task lists for all hosts with included blocks 46400 1727204639.56012: done extending task lists 46400 1727204639.56013: done processing included files 46400 1727204639.56013: results queue empty 46400 1727204639.56014: checking for any_errors_fatal 46400 1727204639.56018: done checking for any_errors_fatal 46400 1727204639.56018: checking for max_fail_percentage 46400 1727204639.56019: done checking for max_fail_percentage 46400 1727204639.56019: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.56020: done checking to see if all hosts have failed 46400 1727204639.56020: getting the remaining hosts for this loop 46400 1727204639.56021: done getting the remaining hosts for this loop 46400 1727204639.56023: getting the next task for host managed-node2 46400 1727204639.56026: done getting next task for host managed-node2 46400 1727204639.56027: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204639.56030: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204639.56031: getting variables 46400 1727204639.56032: in VariableManager get_vars() 46400 1727204639.56041: Calling all_inventory to load vars for managed-node2 46400 1727204639.56043: Calling groups_inventory to load vars for managed-node2 46400 1727204639.56044: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.56048: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.56050: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.56052: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.56828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.58081: done with get_vars() 46400 1727204639.58116: done getting variables 46400 1727204639.58173: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.063) 0:02:09.866 ***** 46400 1727204639.58209: entering _queue_task() for managed-node2/set_fact 46400 1727204639.58689: worker is 1 (out of 1 available) 46400 1727204639.58757: exiting _queue_task() for managed-node2/set_fact 46400 1727204639.58804: done queuing things up, now waiting for results queue to drain 46400 1727204639.58806: waiting for pending results... 
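The task queued here comes from get_profile_stat.yml:3 and is a set_fact call that initializes the three lsr_net_profile_* flags; the flag names and values match the ansible_facts reported in the task result further down. A minimal sketch of such a task (assumed, not quoted verbatim from the test file) would be:

    # Sketch only: initializes the flags that the later profile asserts rely on.
    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false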
46400 1727204639.59012: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 46400 1727204639.59108: in run() - task 0affcd87-79f5-1303-fda8-000000002888 46400 1727204639.59118: variable 'ansible_search_path' from source: unknown 46400 1727204639.59122: variable 'ansible_search_path' from source: unknown 46400 1727204639.59151: calling self._execute() 46400 1727204639.59228: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.59234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.59243: variable 'omit' from source: magic vars 46400 1727204639.59533: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.59543: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.59549: variable 'omit' from source: magic vars 46400 1727204639.59591: variable 'omit' from source: magic vars 46400 1727204639.59615: variable 'omit' from source: magic vars 46400 1727204639.59658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204639.59689: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204639.59709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204639.59722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.59736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.59759: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204639.59762: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.59770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.59840: Set connection var ansible_shell_type to sh 46400 1727204639.59847: Set connection var ansible_shell_executable to /bin/sh 46400 1727204639.59852: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204639.59857: Set connection var ansible_connection to ssh 46400 1727204639.59866: Set connection var ansible_pipelining to False 46400 1727204639.59873: Set connection var ansible_timeout to 10 46400 1727204639.59892: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.59895: variable 'ansible_connection' from source: unknown 46400 1727204639.59900: variable 'ansible_module_compression' from source: unknown 46400 1727204639.59902: variable 'ansible_shell_type' from source: unknown 46400 1727204639.59905: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.59907: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.59909: variable 'ansible_pipelining' from source: unknown 46400 1727204639.59911: variable 'ansible_timeout' from source: unknown 46400 1727204639.59914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.60024: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204639.60032: variable 
'omit' from source: magic vars 46400 1727204639.60038: starting attempt loop 46400 1727204639.60041: running the handler 46400 1727204639.60054: handler run complete 46400 1727204639.60064: attempt loop complete, returning result 46400 1727204639.60072: _execute() done 46400 1727204639.60075: dumping result to json 46400 1727204639.60078: done dumping result, returning 46400 1727204639.60085: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1303-fda8-000000002888] 46400 1727204639.60090: sending task result for task 0affcd87-79f5-1303-fda8-000000002888 46400 1727204639.60181: done sending task result for task 0affcd87-79f5-1303-fda8-000000002888 46400 1727204639.60184: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 46400 1727204639.60233: no more pending results, returning what we have 46400 1727204639.60237: results queue empty 46400 1727204639.60238: checking for any_errors_fatal 46400 1727204639.60240: done checking for any_errors_fatal 46400 1727204639.60241: checking for max_fail_percentage 46400 1727204639.60243: done checking for max_fail_percentage 46400 1727204639.60243: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.60244: done checking to see if all hosts have failed 46400 1727204639.60245: getting the remaining hosts for this loop 46400 1727204639.60246: done getting the remaining hosts for this loop 46400 1727204639.60250: getting the next task for host managed-node2 46400 1727204639.60263: done getting next task for host managed-node2 46400 1727204639.60267: ^ task is: TASK: Stat profile file 46400 1727204639.60278: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204639.60284: getting variables 46400 1727204639.60285: in VariableManager get_vars() 46400 1727204639.60331: Calling all_inventory to load vars for managed-node2 46400 1727204639.60334: Calling groups_inventory to load vars for managed-node2 46400 1727204639.60338: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.60348: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.60350: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.60352: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.62150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.63862: done with get_vars() 46400 1727204639.63894: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.057) 0:02:09.924 ***** 46400 1727204639.63994: entering _queue_task() for managed-node2/stat 46400 1727204639.64352: worker is 1 (out of 1 available) 46400 1727204639.64367: exiting _queue_task() for managed-node2/stat 46400 1727204639.64380: done queuing things up, now waiting for results queue to drain 46400 1727204639.64381: waiting for pending results... 46400 1727204639.64685: running TaskExecutor() for managed-node2/TASK: Stat profile file 46400 1727204639.64812: in run() - task 0affcd87-79f5-1303-fda8-000000002889 46400 1727204639.64834: variable 'ansible_search_path' from source: unknown 46400 1727204639.64842: variable 'ansible_search_path' from source: unknown 46400 1727204639.64881: calling self._execute() 46400 1727204639.64984: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.64997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.65012: variable 'omit' from source: magic vars 46400 1727204639.65436: variable 'ansible_distribution_major_version' from source: facts 46400 1727204639.65454: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204639.65468: variable 'omit' from source: magic vars 46400 1727204639.65528: variable 'omit' from source: magic vars 46400 1727204639.65645: variable 'profile' from source: play vars 46400 1727204639.65657: variable 'interface' from source: play vars 46400 1727204639.65733: variable 'interface' from source: play vars 46400 1727204639.65758: variable 'omit' from source: magic vars 46400 1727204639.65817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204639.65862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204639.65894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204639.65922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.65938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204639.65976: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204639.65986: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.65995: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.66102: Set connection var ansible_shell_type to sh 46400 1727204639.66119: Set connection var ansible_shell_executable to /bin/sh 46400 1727204639.66134: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204639.66145: Set connection var ansible_connection to ssh 46400 1727204639.66155: Set connection var ansible_pipelining to False 46400 1727204639.66168: Set connection var ansible_timeout to 10 46400 1727204639.66198: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.66206: variable 'ansible_connection' from source: unknown 46400 1727204639.66213: variable 'ansible_module_compression' from source: unknown 46400 1727204639.66220: variable 'ansible_shell_type' from source: unknown 46400 1727204639.66227: variable 'ansible_shell_executable' from source: unknown 46400 1727204639.66233: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204639.66247: variable 'ansible_pipelining' from source: unknown 46400 1727204639.66254: variable 'ansible_timeout' from source: unknown 46400 1727204639.66263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204639.66486: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204639.66504: variable 'omit' from source: magic vars 46400 1727204639.66514: starting attempt loop 46400 1727204639.66521: running the handler 46400 1727204639.66539: _low_level_execute_command(): starting 46400 1727204639.66551: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204639.67340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204639.67357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.67376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.67397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.67447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.67460: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204639.67477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.67497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204639.67510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204639.67522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204639.67537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.67554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.67573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.67588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.67600: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204639.67615: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.67698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.67722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.67740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.67821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.69484: stdout chunk (state=3): >>>/root <<< 46400 1727204639.69684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.69687: stdout chunk (state=3): >>><<< 46400 1727204639.69690: stderr chunk (state=3): >>><<< 46400 1727204639.69812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.69816: _low_level_execute_command(): starting 46400 1727204639.69819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351 `" && echo ansible-tmp-1727204639.6971035-55491-175988409743351="` echo /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351 `" ) && sleep 0' 46400 1727204639.70384: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.70388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.70420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.70424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.70427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.70484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.70487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.70532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.72412: stdout chunk (state=3): >>>ansible-tmp-1727204639.6971035-55491-175988409743351=/root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351 <<< 46400 1727204639.72515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.72579: stderr chunk (state=3): >>><<< 46400 1727204639.72582: stdout chunk (state=3): >>><<< 46400 1727204639.72599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.6971035-55491-175988409743351=/root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.72642: variable 'ansible_module_compression' from source: unknown 46400 1727204639.72695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204639.72727: variable 'ansible_facts' from source: unknown 46400 1727204639.72781: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/AnsiballZ_stat.py 46400 1727204639.72891: Sending initial data 46400 1727204639.72900: Sent initial data (153 bytes) 46400 1727204639.73581: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.73585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.73621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.73624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204639.73626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.73680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.73683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.73731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.75463: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204639.75530: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204639.75568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp4yoevaab /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/AnsiballZ_stat.py <<< 46400 1727204639.75740: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204639.76438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.76555: stderr chunk (state=3): >>><<< 46400 1727204639.76558: stdout chunk (state=3): >>><<< 46400 1727204639.76572: done transferring module to remote 46400 1727204639.76582: _low_level_execute_command(): starting 46400 1727204639.76587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/ /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/AnsiballZ_stat.py && sleep 0' 46400 1727204639.77042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.77046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.77068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.77087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.77090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 46400 1727204639.77147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.77151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.77199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.78918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.79030: stderr chunk (state=3): >>><<< 46400 1727204639.79043: stdout chunk (state=3): >>><<< 46400 1727204639.79078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.79082: _low_level_execute_command(): starting 46400 1727204639.79085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/AnsiballZ_stat.py && sleep 0' 46400 1727204639.79828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204639.79832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.79835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.79840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.79842: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204639.79844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.79846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204639.79848: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204639.79850: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204639.79852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.79854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.79856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.79895: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204639.79898: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204639.79900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.79951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204639.79977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.79995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.80085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.93083: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204639.93997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 46400 1727204639.94058: stderr chunk (state=3): >>><<< 46400 1727204639.94067: stdout chunk (state=3): >>><<< 46400 1727204639.94082: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
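For orientation: the stat result above corresponds to the "Stat profile file" task named later in this log. A sketch reconstructed from the module_args echoed in the result follows; the literal path is presumably templated from the profile/interface variable, and the register name is inferred from the profile_stat.stat.exists conditional evaluated further down, so treat both as assumptions rather than the exact YAML in get_profile_stat.yml.

- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-statebr  # likely "ifcfg-{{ profile }}" in the source task
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: profile_stat  # assumed; inferred from the later profile_stat.stat.exists check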
46400 1727204639.94107: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204639.94117: _low_level_execute_command(): starting 46400 1727204639.94122: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.6971035-55491-175988409743351/ > /dev/null 2>&1 && sleep 0' 46400 1727204639.94606: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204639.94612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204639.94648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204639.94654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204639.94661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204639.94668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204639.94715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204639.94718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204639.94766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204639.96540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204639.96601: stderr chunk (state=3): >>><<< 46400 1727204639.96607: stdout chunk (state=3): >>><<< 46400 1727204639.96626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204639.96632: handler run complete 46400 1727204639.96649: attempt loop complete, returning result 46400 1727204639.96653: _execute() done 46400 1727204639.96656: dumping result to json 46400 1727204639.96658: done dumping result, returning 46400 1727204639.96668: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-1303-fda8-000000002889] 46400 1727204639.96674: sending task result for task 0affcd87-79f5-1303-fda8-000000002889 46400 1727204639.96776: done sending task result for task 0affcd87-79f5-1303-fda8-000000002889 46400 1727204639.96779: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204639.96831: no more pending results, returning what we have 46400 1727204639.96835: results queue empty 46400 1727204639.96836: checking for any_errors_fatal 46400 1727204639.96846: done checking for any_errors_fatal 46400 1727204639.96846: checking for max_fail_percentage 46400 1727204639.96848: done checking for max_fail_percentage 46400 1727204639.96849: checking to see if all hosts have failed and the running result is not ok 46400 1727204639.96850: done checking to see if all hosts have failed 46400 1727204639.96850: getting the remaining hosts for this loop 46400 1727204639.96852: done getting the remaining hosts for this loop 46400 1727204639.96856: getting the next task for host managed-node2 46400 1727204639.96868: done getting next task for host managed-node2 46400 1727204639.96870: ^ task is: TASK: Set NM profile exist flag based on the profile files 46400 1727204639.96877: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204639.96882: getting variables 46400 1727204639.96884: in VariableManager get_vars() 46400 1727204639.96933: Calling all_inventory to load vars for managed-node2 46400 1727204639.96936: Calling groups_inventory to load vars for managed-node2 46400 1727204639.96939: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204639.96950: Calling all_plugins_play to load vars for managed-node2 46400 1727204639.96952: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204639.96955: Calling groups_plugins_play to load vars for managed-node2 46400 1727204639.97989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204639.99487: done with get_vars() 46400 1727204639.99522: done getting variables 46400 1727204639.99597: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.356) 0:02:10.280 ***** 46400 1727204639.99634: entering _queue_task() for managed-node2/set_fact 46400 1727204640.00047: worker is 1 (out of 1 available) 46400 1727204640.00062: exiting _queue_task() for managed-node2/set_fact 46400 1727204640.00078: done queuing things up, now waiting for results queue to drain 46400 1727204640.00080: waiting for pending results... 
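The task queued here, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is a conditional set_fact. Based on the profile_stat.stat.exists conditional evaluated just below and the resulting skip, it is roughly the following sketch; the fact name and value are not visible in this log and are placeholders.

- name: Set NM profile exist flag based on the profile files
  set_fact:
    nm_profile_found: true  # placeholder fact name/value; not shown in this log
  when: profile_stat.stat.exists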
46400 1727204640.00391: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 46400 1727204640.00544: in run() - task 0affcd87-79f5-1303-fda8-00000000288a 46400 1727204640.00580: variable 'ansible_search_path' from source: unknown 46400 1727204640.00590: variable 'ansible_search_path' from source: unknown 46400 1727204640.00635: calling self._execute() 46400 1727204640.00748: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.00765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.00782: variable 'omit' from source: magic vars 46400 1727204640.01203: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.01222: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.01363: variable 'profile_stat' from source: set_fact 46400 1727204640.01382: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204640.01395: when evaluation is False, skipping this task 46400 1727204640.01403: _execute() done 46400 1727204640.01411: dumping result to json 46400 1727204640.01419: done dumping result, returning 46400 1727204640.01430: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1303-fda8-00000000288a] 46400 1727204640.01435: sending task result for task 0affcd87-79f5-1303-fda8-00000000288a 46400 1727204640.01530: done sending task result for task 0affcd87-79f5-1303-fda8-00000000288a 46400 1727204640.01534: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204640.01588: no more pending results, returning what we have 46400 1727204640.01593: results queue empty 46400 1727204640.01594: checking for any_errors_fatal 46400 1727204640.01606: done checking for any_errors_fatal 46400 1727204640.01606: checking for max_fail_percentage 46400 1727204640.01608: done checking for max_fail_percentage 46400 1727204640.01609: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.01610: done checking to see if all hosts have failed 46400 1727204640.01610: getting the remaining hosts for this loop 46400 1727204640.01613: done getting the remaining hosts for this loop 46400 1727204640.01616: getting the next task for host managed-node2 46400 1727204640.01625: done getting next task for host managed-node2 46400 1727204640.01628: ^ task is: TASK: Get NM profile info 46400 1727204640.01636: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.01641: getting variables 46400 1727204640.01642: in VariableManager get_vars() 46400 1727204640.01698: Calling all_inventory to load vars for managed-node2 46400 1727204640.01701: Calling groups_inventory to load vars for managed-node2 46400 1727204640.01704: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.01716: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.01719: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.01721: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.02590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.03658: done with get_vars() 46400 1727204640.03679: done getting variables 46400 1727204640.03726: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.041) 0:02:10.322 ***** 46400 1727204640.03751: entering _queue_task() for managed-node2/shell 46400 1727204640.04002: worker is 1 (out of 1 available) 46400 1727204640.04016: exiting _queue_task() for managed-node2/shell 46400 1727204640.04028: done queuing things up, now waiting for results queue to drain 46400 1727204640.04030: waiting for pending results... 
46400 1727204640.04215: running TaskExecutor() for managed-node2/TASK: Get NM profile info 46400 1727204640.04294: in run() - task 0affcd87-79f5-1303-fda8-00000000288b 46400 1727204640.04306: variable 'ansible_search_path' from source: unknown 46400 1727204640.04309: variable 'ansible_search_path' from source: unknown 46400 1727204640.04336: calling self._execute() 46400 1727204640.04416: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.04421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.04428: variable 'omit' from source: magic vars 46400 1727204640.04712: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.04722: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.04728: variable 'omit' from source: magic vars 46400 1727204640.04767: variable 'omit' from source: magic vars 46400 1727204640.04843: variable 'profile' from source: play vars 46400 1727204640.04847: variable 'interface' from source: play vars 46400 1727204640.04900: variable 'interface' from source: play vars 46400 1727204640.04915: variable 'omit' from source: magic vars 46400 1727204640.04955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204640.04987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204640.05004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204640.05020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.05032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.05056: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204640.05059: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.05066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.05135: Set connection var ansible_shell_type to sh 46400 1727204640.05148: Set connection var ansible_shell_executable to /bin/sh 46400 1727204640.05153: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204640.05158: Set connection var ansible_connection to ssh 46400 1727204640.05163: Set connection var ansible_pipelining to False 46400 1727204640.05174: Set connection var ansible_timeout to 10 46400 1727204640.05192: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.05195: variable 'ansible_connection' from source: unknown 46400 1727204640.05197: variable 'ansible_module_compression' from source: unknown 46400 1727204640.05200: variable 'ansible_shell_type' from source: unknown 46400 1727204640.05202: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.05205: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.05209: variable 'ansible_pipelining' from source: unknown 46400 1727204640.05212: variable 'ansible_timeout' from source: unknown 46400 1727204640.05216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.05324: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204640.05335: variable 'omit' from source: magic vars 46400 1727204640.05343: starting attempt loop 46400 1727204640.05346: running the handler 46400 1727204640.05352: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204640.05375: _low_level_execute_command(): starting 46400 1727204640.05383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204640.05920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.05937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.05950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.05965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.05978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.06029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.06034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.06046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.06096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.07691: stdout chunk (state=3): >>>/root <<< 46400 1727204640.07794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.07883: stderr chunk (state=3): >>><<< 46400 1727204640.07894: stdout chunk (state=3): >>><<< 46400 1727204640.07924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.07940: _low_level_execute_command(): starting 46400 1727204640.07951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645 `" && echo ansible-tmp-1727204640.0792632-55514-169368341758645="` echo /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645 `" ) && sleep 0' 46400 1727204640.08408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.08421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.08440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.08454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204640.08472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.08511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.08522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.08572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.10417: stdout chunk (state=3): >>>ansible-tmp-1727204640.0792632-55514-169368341758645=/root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645 <<< 46400 1727204640.10528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.10583: stderr chunk (state=3): >>><<< 46400 1727204640.10586: stdout chunk (state=3): >>><<< 46400 1727204640.10602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204640.0792632-55514-169368341758645=/root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.10629: variable 'ansible_module_compression' from source: unknown 46400 1727204640.10681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204640.10712: variable 'ansible_facts' from source: unknown 46400 1727204640.10765: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/AnsiballZ_command.py 46400 1727204640.10875: Sending initial data 46400 1727204640.10878: Sent initial data (156 bytes) 46400 1727204640.11545: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.11562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.11577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.11590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.11600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.11646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.11658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.11701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.13398: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 46400 1727204640.13411: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 46400 1727204640.13441: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204640.13489: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204640.13520: stdout chunk (state=3): >>>sftp> 
put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpsw3gi4aw /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/AnsiballZ_command.py <<< 46400 1727204640.13552: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204640.14605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.14751: stderr chunk (state=3): >>><<< 46400 1727204640.14760: stdout chunk (state=3): >>><<< 46400 1727204640.14789: done transferring module to remote 46400 1727204640.14805: _low_level_execute_command(): starting 46400 1727204640.14819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/ /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/AnsiballZ_command.py && sleep 0' 46400 1727204640.15498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204640.15513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.15529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.15548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.15603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.15616: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204640.15631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.15651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204640.15669: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204640.15689: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204640.15705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.15719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.15734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.15746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.15756: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204640.15777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.15859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.15887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.15909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.16003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.17787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.17833: stderr chunk (state=3): >>><<< 46400 1727204640.17837: stdout chunk (state=3): >>><<< 46400 1727204640.17870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.17873: _low_level_execute_command(): starting 46400 1727204640.17876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/AnsiballZ_command.py && sleep 0' 46400 1727204640.18565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204640.18582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.18596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.18622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.18672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.18685: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204640.18698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.18718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204640.18730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204640.18740: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204640.18750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.18768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.18785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.18797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.18807: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204640.18819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.18905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.18928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.18948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.19029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.33947: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection 
show |grep statebr | grep /etc", "start": "2024-09-24 15:04:00.321194", "end": "2024-09-24 15:04:00.338465", "delta": "0:00:00.017271", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204640.35091: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. <<< 46400 1727204640.35158: stderr chunk (state=3): >>><<< 46400 1727204640.35161: stdout chunk (state=3): >>><<< 46400 1727204640.35316: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:04:00.321194", "end": "2024-09-24 15:04:00.338465", "delta": "0:00:00.017271", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
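The failing command above belongs to the "Get NM profile info" task (get_profile_stat.yml:25). The sketch below is consistent with the _raw_params echoed in the result; the register name is inferred from the nm_profile_exists.rc == 0 conditional evaluated later, and ignore_errors from the "...ignoring" marker on the failure, so both are assumptions.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc  # "statebr" is presumably "{{ profile }}" in the source task
  register: nm_profile_exists  # assumed; inferred from the nm_profile_exists.rc == 0 check
  ignore_errors: true          # assumed; inferred from the "...ignoring" on the rc=1 result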
46400 1727204640.35321: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204640.35329: _low_level_execute_command(): starting 46400 1727204640.35332: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204640.0792632-55514-169368341758645/ > /dev/null 2>&1 && sleep 0' 46400 1727204640.35944: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204640.35961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.35985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.36004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.36049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.36062: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204640.36086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.36105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204640.36116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204640.36127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204640.36138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.36151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.36168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.36179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.36198: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204640.36213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.36288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.36319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.36337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.36410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.38209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.38322: stderr chunk (state=3): >>><<< 46400 1727204640.38334: stdout chunk (state=3): >>><<< 46400 1727204640.38475: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.38478: handler run complete 46400 1727204640.38481: Evaluated conditional (False): False 46400 1727204640.38483: attempt loop complete, returning result 46400 1727204640.38485: _execute() done 46400 1727204640.38487: dumping result to json 46400 1727204640.38488: done dumping result, returning 46400 1727204640.38490: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-1303-fda8-00000000288b] 46400 1727204640.38492: sending task result for task 0affcd87-79f5-1303-fda8-00000000288b fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017271", "end": "2024-09-24 15:04:00.338465", "rc": 1, "start": "2024-09-24 15:04:00.321194" } MSG: non-zero return code ...ignoring 46400 1727204640.38843: no more pending results, returning what we have 46400 1727204640.38847: results queue empty 46400 1727204640.38849: checking for any_errors_fatal 46400 1727204640.38855: done checking for any_errors_fatal 46400 1727204640.38856: checking for max_fail_percentage 46400 1727204640.38857: done checking for max_fail_percentage 46400 1727204640.38858: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.38862: done checking to see if all hosts have failed 46400 1727204640.38862: getting the remaining hosts for this loop 46400 1727204640.38869: done getting the remaining hosts for this loop 46400 1727204640.38874: getting the next task for host managed-node2 46400 1727204640.38884: done getting next task for host managed-node2 46400 1727204640.38887: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204640.38893: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.38897: getting variables 46400 1727204640.38899: in VariableManager get_vars() 46400 1727204640.38946: Calling all_inventory to load vars for managed-node2 46400 1727204640.38949: Calling groups_inventory to load vars for managed-node2 46400 1727204640.38953: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.38969: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.38972: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.38977: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.39611: done sending task result for task 0affcd87-79f5-1303-fda8-00000000288b 46400 1727204640.39614: WORKER PROCESS EXITING 46400 1727204640.40801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.42753: done with get_vars() 46400 1727204640.42798: done getting variables 46400 1727204640.42867: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.391) 0:02:10.713 ***** 46400 1727204640.42918: entering _queue_task() for managed-node2/set_fact 46400 1727204640.43325: worker is 1 (out of 1 available) 46400 1727204640.43340: exiting _queue_task() for managed-node2/set_fact 46400 1727204640.43353: done queuing things up, now waiting for results queue to drain 46400 1727204640.43355: waiting for pending results... 
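The task queued here, "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" (get_profile_stat.yml:35), is another conditional set_fact; it is skipped below because the nmcli command returned rc=1. A sketch with placeholder fact names, since the actual names are not shown in this log:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    nm_profile_found: true    # placeholder names; not visible in this log
    managed_by_ansible: true  # placeholder
  when: nm_profile_exists.rc == 0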
46400 1727204640.43668: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 46400 1727204640.43810: in run() - task 0affcd87-79f5-1303-fda8-00000000288c 46400 1727204640.43832: variable 'ansible_search_path' from source: unknown 46400 1727204640.43840: variable 'ansible_search_path' from source: unknown 46400 1727204640.43889: calling self._execute() 46400 1727204640.44004: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.44023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.44036: variable 'omit' from source: magic vars 46400 1727204640.44473: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.44491: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.44671: variable 'nm_profile_exists' from source: set_fact 46400 1727204640.44691: Evaluated conditional (nm_profile_exists.rc == 0): False 46400 1727204640.44699: when evaluation is False, skipping this task 46400 1727204640.44705: _execute() done 46400 1727204640.44711: dumping result to json 46400 1727204640.44718: done dumping result, returning 46400 1727204640.44727: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1303-fda8-00000000288c] 46400 1727204640.44742: sending task result for task 0affcd87-79f5-1303-fda8-00000000288c 46400 1727204640.44877: done sending task result for task 0affcd87-79f5-1303-fda8-00000000288c skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 46400 1727204640.44929: no more pending results, returning what we have 46400 1727204640.44934: results queue empty 46400 1727204640.44935: checking for any_errors_fatal 46400 1727204640.44946: done checking for any_errors_fatal 46400 1727204640.44947: checking for max_fail_percentage 46400 1727204640.44949: done checking for max_fail_percentage 46400 1727204640.44950: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.44951: done checking to see if all hosts have failed 46400 1727204640.44952: getting the remaining hosts for this loop 46400 1727204640.44954: done getting the remaining hosts for this loop 46400 1727204640.44959: getting the next task for host managed-node2 46400 1727204640.44978: done getting next task for host managed-node2 46400 1727204640.44982: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204640.44992: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.44997: getting variables 46400 1727204640.44999: in VariableManager get_vars() 46400 1727204640.45058: Calling all_inventory to load vars for managed-node2 46400 1727204640.45067: Calling groups_inventory to load vars for managed-node2 46400 1727204640.45071: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.45087: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.45091: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.45095: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.46151: WORKER PROCESS EXITING 46400 1727204640.47395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.49285: done with get_vars() 46400 1727204640.49328: done getting variables 46400 1727204640.49400: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204640.49543: variable 'profile' from source: play vars 46400 1727204640.49548: variable 'interface' from source: play vars 46400 1727204640.49614: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.067) 0:02:10.781 ***** 46400 1727204640.49662: entering _queue_task() for managed-node2/command 46400 1727204640.50044: worker is 1 (out of 1 available) 46400 1727204640.50061: exiting _queue_task() for managed-node2/command 46400 1727204640.50090: done queuing things up, now waiting for results queue to drain 46400 1727204640.50092: waiting for pending results... 
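The "Get the ansible_managed comment in ifcfg-statebr" task queued above (get_profile_stat.yml:49) is a command task whose name is templated from the profile/interface play vars and which only runs when profile_stat.stat.exists is true; the next entries show it being skipped because the ifcfg file is absent. The actual command is not visible in this excerpt, so the grep and register name below are purely illustrative of the pattern.

# Illustrative only; the real command and register name are not shown in this log.
- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: 'grep "^# ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}'
  register: ansible_managed_grep
  when: profile_stat.stat.exists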
46400 1727204640.50404: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 46400 1727204640.50571: in run() - task 0affcd87-79f5-1303-fda8-00000000288e 46400 1727204640.50595: variable 'ansible_search_path' from source: unknown 46400 1727204640.50604: variable 'ansible_search_path' from source: unknown 46400 1727204640.50654: calling self._execute() 46400 1727204640.50775: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.50786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.50801: variable 'omit' from source: magic vars 46400 1727204640.51215: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.51232: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.51381: variable 'profile_stat' from source: set_fact 46400 1727204640.51405: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204640.51412: when evaluation is False, skipping this task 46400 1727204640.51418: _execute() done 46400 1727204640.51425: dumping result to json 46400 1727204640.51431: done dumping result, returning 46400 1727204640.51439: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000288e] 46400 1727204640.51448: sending task result for task 0affcd87-79f5-1303-fda8-00000000288e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204640.51607: no more pending results, returning what we have 46400 1727204640.51612: results queue empty 46400 1727204640.51613: checking for any_errors_fatal 46400 1727204640.51622: done checking for any_errors_fatal 46400 1727204640.51623: checking for max_fail_percentage 46400 1727204640.51625: done checking for max_fail_percentage 46400 1727204640.51626: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.51627: done checking to see if all hosts have failed 46400 1727204640.51627: getting the remaining hosts for this loop 46400 1727204640.51629: done getting the remaining hosts for this loop 46400 1727204640.51634: getting the next task for host managed-node2 46400 1727204640.51645: done getting next task for host managed-node2 46400 1727204640.51648: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 46400 1727204640.51655: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.51664: getting variables 46400 1727204640.51666: in VariableManager get_vars() 46400 1727204640.51720: Calling all_inventory to load vars for managed-node2 46400 1727204640.51723: Calling groups_inventory to load vars for managed-node2 46400 1727204640.51727: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.51741: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.51745: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.51748: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.52757: done sending task result for task 0affcd87-79f5-1303-fda8-00000000288e 46400 1727204640.52765: WORKER PROCESS EXITING 46400 1727204640.53774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.55634: done with get_vars() 46400 1727204640.55674: done getting variables 46400 1727204640.55739: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204640.55875: variable 'profile' from source: play vars 46400 1727204640.55879: variable 'interface' from source: play vars 46400 1727204640.55950: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.063) 0:02:10.844 ***** 46400 1727204640.55989: entering _queue_task() for managed-node2/set_fact 46400 1727204640.56389: worker is 1 (out of 1 available) 46400 1727204640.56401: exiting _queue_task() for managed-node2/set_fact 46400 1727204640.56413: done queuing things up, now waiting for results queue to drain 46400 1727204640.56415: waiting for pending results... 
46400 1727204640.56732: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 46400 1727204640.56902: in run() - task 0affcd87-79f5-1303-fda8-00000000288f 46400 1727204640.56922: variable 'ansible_search_path' from source: unknown 46400 1727204640.56929: variable 'ansible_search_path' from source: unknown 46400 1727204640.56978: calling self._execute() 46400 1727204640.57099: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.57115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.57128: variable 'omit' from source: magic vars 46400 1727204640.57548: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.57571: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.57718: variable 'profile_stat' from source: set_fact 46400 1727204640.57741: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204640.57753: when evaluation is False, skipping this task 46400 1727204640.57767: _execute() done 46400 1727204640.57776: dumping result to json 46400 1727204640.57783: done dumping result, returning 46400 1727204640.57791: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-00000000288f] 46400 1727204640.57801: sending task result for task 0affcd87-79f5-1303-fda8-00000000288f skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204640.57971: no more pending results, returning what we have 46400 1727204640.57977: results queue empty 46400 1727204640.57978: checking for any_errors_fatal 46400 1727204640.57987: done checking for any_errors_fatal 46400 1727204640.57988: checking for max_fail_percentage 46400 1727204640.57990: done checking for max_fail_percentage 46400 1727204640.57991: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.57992: done checking to see if all hosts have failed 46400 1727204640.57993: getting the remaining hosts for this loop 46400 1727204640.57995: done getting the remaining hosts for this loop 46400 1727204640.57999: getting the next task for host managed-node2 46400 1727204640.58009: done getting next task for host managed-node2 46400 1727204640.58012: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 46400 1727204640.58019: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.58025: getting variables 46400 1727204640.58027: in VariableManager get_vars() 46400 1727204640.58086: Calling all_inventory to load vars for managed-node2 46400 1727204640.58089: Calling groups_inventory to load vars for managed-node2 46400 1727204640.58093: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.58108: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.58111: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.58114: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.59171: done sending task result for task 0affcd87-79f5-1303-fda8-00000000288f 46400 1727204640.59175: WORKER PROCESS EXITING 46400 1727204640.60270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.62108: done with get_vars() 46400 1727204640.62147: done getting variables 46400 1727204640.62225: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204640.62332: variable 'profile' from source: play vars 46400 1727204640.62336: variable 'interface' from source: play vars 46400 1727204640.62393: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.064) 0:02:10.908 ***** 46400 1727204640.62421: entering _queue_task() for managed-node2/command 46400 1727204640.62680: worker is 1 (out of 1 available) 46400 1727204640.62693: exiting _queue_task() for managed-node2/command 46400 1727204640.62708: done queuing things up, now waiting for results queue to drain 46400 1727204640.62709: waiting for pending results... 
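The same skip pattern repeats for the remaining fingerprint-comment checks below: every comment-related task in get_profile_stat.yml is conditioned on profile_stat.stat.exists, so with the profile file absent the whole block collapses into a series of "Conditional result was False" skips. The profile_stat variable itself is registered earlier than this excerpt; it would typically come from a stat task along the following lines (task name and path are assumptions).

# Assumed shape of the earlier task that registers profile_stat (not shown in this excerpt).
- name: "Stat the ifcfg file for {{ profile }}"
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat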
46400 1727204640.62900: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 46400 1727204640.62983: in run() - task 0affcd87-79f5-1303-fda8-000000002890 46400 1727204640.62994: variable 'ansible_search_path' from source: unknown 46400 1727204640.62997: variable 'ansible_search_path' from source: unknown 46400 1727204640.63027: calling self._execute() 46400 1727204640.63108: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.63111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.63121: variable 'omit' from source: magic vars 46400 1727204640.63401: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.63412: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.63506: variable 'profile_stat' from source: set_fact 46400 1727204640.63514: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204640.63517: when evaluation is False, skipping this task 46400 1727204640.63520: _execute() done 46400 1727204640.63523: dumping result to json 46400 1727204640.63525: done dumping result, returning 46400 1727204640.63531: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000002890] 46400 1727204640.63536: sending task result for task 0affcd87-79f5-1303-fda8-000000002890 46400 1727204640.63624: done sending task result for task 0affcd87-79f5-1303-fda8-000000002890 46400 1727204640.63627: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204640.63685: no more pending results, returning what we have 46400 1727204640.63689: results queue empty 46400 1727204640.63690: checking for any_errors_fatal 46400 1727204640.63700: done checking for any_errors_fatal 46400 1727204640.63701: checking for max_fail_percentage 46400 1727204640.63703: done checking for max_fail_percentage 46400 1727204640.63704: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.63705: done checking to see if all hosts have failed 46400 1727204640.63705: getting the remaining hosts for this loop 46400 1727204640.63707: done getting the remaining hosts for this loop 46400 1727204640.63711: getting the next task for host managed-node2 46400 1727204640.63718: done getting next task for host managed-node2 46400 1727204640.63721: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 46400 1727204640.63727: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204640.63731: getting variables 46400 1727204640.63732: in VariableManager get_vars() 46400 1727204640.63788: Calling all_inventory to load vars for managed-node2 46400 1727204640.63791: Calling groups_inventory to load vars for managed-node2 46400 1727204640.63795: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.63807: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.63810: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.63813: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.65299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.66895: done with get_vars() 46400 1727204640.66933: done getting variables 46400 1727204640.67007: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204640.67119: variable 'profile' from source: play vars 46400 1727204640.67122: variable 'interface' from source: play vars 46400 1727204640.67171: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.047) 0:02:10.956 ***** 46400 1727204640.67198: entering _queue_task() for managed-node2/set_fact 46400 1727204640.67454: worker is 1 (out of 1 available) 46400 1727204640.67472: exiting _queue_task() for managed-node2/set_fact 46400 1727204640.67486: done queuing things up, now waiting for results queue to drain 46400 1727204640.67488: waiting for pending results... 
46400 1727204640.67685: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 46400 1727204640.67771: in run() - task 0affcd87-79f5-1303-fda8-000000002891 46400 1727204640.67782: variable 'ansible_search_path' from source: unknown 46400 1727204640.67785: variable 'ansible_search_path' from source: unknown 46400 1727204640.67815: calling self._execute() 46400 1727204640.67898: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.67901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.67911: variable 'omit' from source: magic vars 46400 1727204640.68204: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.68213: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.68306: variable 'profile_stat' from source: set_fact 46400 1727204640.68315: Evaluated conditional (profile_stat.stat.exists): False 46400 1727204640.68318: when evaluation is False, skipping this task 46400 1727204640.68321: _execute() done 46400 1727204640.68323: dumping result to json 46400 1727204640.68326: done dumping result, returning 46400 1727204640.68332: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcd87-79f5-1303-fda8-000000002891] 46400 1727204640.68337: sending task result for task 0affcd87-79f5-1303-fda8-000000002891 46400 1727204640.68434: done sending task result for task 0affcd87-79f5-1303-fda8-000000002891 46400 1727204640.68437: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 46400 1727204640.68490: no more pending results, returning what we have 46400 1727204640.68495: results queue empty 46400 1727204640.68496: checking for any_errors_fatal 46400 1727204640.68504: done checking for any_errors_fatal 46400 1727204640.68505: checking for max_fail_percentage 46400 1727204640.68506: done checking for max_fail_percentage 46400 1727204640.68507: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.68508: done checking to see if all hosts have failed 46400 1727204640.68509: getting the remaining hosts for this loop 46400 1727204640.68510: done getting the remaining hosts for this loop 46400 1727204640.68514: getting the next task for host managed-node2 46400 1727204640.68523: done getting next task for host managed-node2 46400 1727204640.68526: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 46400 1727204640.68531: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204640.68535: getting variables 46400 1727204640.68537: in VariableManager get_vars() 46400 1727204640.68588: Calling all_inventory to load vars for managed-node2 46400 1727204640.68596: Calling groups_inventory to load vars for managed-node2 46400 1727204640.68600: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.68613: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.68615: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.68618: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.70169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.71106: done with get_vars() 46400 1727204640.71126: done getting variables 46400 1727204640.71175: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204640.71271: variable 'profile' from source: play vars 46400 1727204640.71274: variable 'interface' from source: play vars 46400 1727204640.71318: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.041) 0:02:10.998 ***** 46400 1727204640.71343: entering _queue_task() for managed-node2/assert 46400 1727204640.71601: worker is 1 (out of 1 available) 46400 1727204640.71617: exiting _queue_task() for managed-node2/assert 46400 1727204640.71629: done queuing things up, now waiting for results queue to drain 46400 1727204640.71631: waiting for pending results... 
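The "Assert that the profile is absent - 'statebr'" task queued above (assert_profile_absent.yml:5) is executed in the entries that follow: the assert action evaluates not lsr_net_profile_exists, which is True, so the host reports "All assertions passed". A minimal sketch of such an assertion, with the failure message wording as an assumption:

# Sketch of the assertion; fail_msg wording is an assumption.
- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is still present"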
46400 1727204640.71829: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 46400 1727204640.71905: in run() - task 0affcd87-79f5-1303-fda8-000000002805 46400 1727204640.71916: variable 'ansible_search_path' from source: unknown 46400 1727204640.71919: variable 'ansible_search_path' from source: unknown 46400 1727204640.71951: calling self._execute() 46400 1727204640.72036: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.72041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.72050: variable 'omit' from source: magic vars 46400 1727204640.72338: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.72348: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.72354: variable 'omit' from source: magic vars 46400 1727204640.72392: variable 'omit' from source: magic vars 46400 1727204640.72467: variable 'profile' from source: play vars 46400 1727204640.72472: variable 'interface' from source: play vars 46400 1727204640.72519: variable 'interface' from source: play vars 46400 1727204640.72534: variable 'omit' from source: magic vars 46400 1727204640.72574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204640.72603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204640.72623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204640.72636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.72646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.72675: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204640.72679: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.72681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.72749: Set connection var ansible_shell_type to sh 46400 1727204640.72757: Set connection var ansible_shell_executable to /bin/sh 46400 1727204640.72766: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204640.72771: Set connection var ansible_connection to ssh 46400 1727204640.72777: Set connection var ansible_pipelining to False 46400 1727204640.72782: Set connection var ansible_timeout to 10 46400 1727204640.72801: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.72804: variable 'ansible_connection' from source: unknown 46400 1727204640.72806: variable 'ansible_module_compression' from source: unknown 46400 1727204640.72808: variable 'ansible_shell_type' from source: unknown 46400 1727204640.72811: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.72813: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.72815: variable 'ansible_pipelining' from source: unknown 46400 1727204640.72818: variable 'ansible_timeout' from source: unknown 46400 1727204640.72829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.72928: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204640.72939: variable 'omit' from source: magic vars 46400 1727204640.72943: starting attempt loop 46400 1727204640.72946: running the handler 46400 1727204640.73036: variable 'lsr_net_profile_exists' from source: set_fact 46400 1727204640.73041: Evaluated conditional (not lsr_net_profile_exists): True 46400 1727204640.73049: handler run complete 46400 1727204640.73059: attempt loop complete, returning result 46400 1727204640.73066: _execute() done 46400 1727204640.73068: dumping result to json 46400 1727204640.73071: done dumping result, returning 46400 1727204640.73078: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [0affcd87-79f5-1303-fda8-000000002805] 46400 1727204640.73083: sending task result for task 0affcd87-79f5-1303-fda8-000000002805 46400 1727204640.73173: done sending task result for task 0affcd87-79f5-1303-fda8-000000002805 46400 1727204640.73177: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204640.73222: no more pending results, returning what we have 46400 1727204640.73226: results queue empty 46400 1727204640.73227: checking for any_errors_fatal 46400 1727204640.73239: done checking for any_errors_fatal 46400 1727204640.73240: checking for max_fail_percentage 46400 1727204640.73242: done checking for max_fail_percentage 46400 1727204640.73243: checking to see if all hosts have failed and the running result is not ok 46400 1727204640.73244: done checking to see if all hosts have failed 46400 1727204640.73245: getting the remaining hosts for this loop 46400 1727204640.73246: done getting the remaining hosts for this loop 46400 1727204640.73250: getting the next task for host managed-node2 46400 1727204640.73268: done getting next task for host managed-node2 46400 1727204640.73273: ^ task is: TASK: Get NetworkManager RPM version 46400 1727204640.73277: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204640.73282: getting variables 46400 1727204640.73283: in VariableManager get_vars() 46400 1727204640.73328: Calling all_inventory to load vars for managed-node2 46400 1727204640.73330: Calling groups_inventory to load vars for managed-node2 46400 1727204640.73334: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204640.73345: Calling all_plugins_play to load vars for managed-node2 46400 1727204640.73347: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204640.73349: Calling groups_plugins_play to load vars for managed-node2 46400 1727204640.74227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204640.75296: done with get_vars() 46400 1727204640.75314: done getting variables 46400 1727204640.75362: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.040) 0:02:11.038 ***** 46400 1727204640.75394: entering _queue_task() for managed-node2/command 46400 1727204640.75647: worker is 1 (out of 1 available) 46400 1727204640.75660: exiting _queue_task() for managed-node2/command 46400 1727204640.75678: done queuing things up, now waiting for results queue to drain 46400 1727204640.75680: waiting for pending results... 
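The "Get NetworkManager RPM version" task (get_NetworkManager_NVR.yml:7) queued above runs next. From the module arguments echoed later in this log, it invokes rpm -qa with a --qf format string through the command module; a sketch follows, with the register name assumed from the follow-up "Store NetworkManager version" task.

# Reconstructed from the module args shown in the log; the register name is an assumption.
- name: Get NetworkManager RPM version
  command: "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager"
  register: NetworkManager_NVR_cmd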
46400 1727204640.75878: running TaskExecutor() for managed-node2/TASK: Get NetworkManager RPM version 46400 1727204640.75957: in run() - task 0affcd87-79f5-1303-fda8-000000002809 46400 1727204640.75973: variable 'ansible_search_path' from source: unknown 46400 1727204640.75976: variable 'ansible_search_path' from source: unknown 46400 1727204640.76008: calling self._execute() 46400 1727204640.76091: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.76095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.76105: variable 'omit' from source: magic vars 46400 1727204640.76395: variable 'ansible_distribution_major_version' from source: facts 46400 1727204640.76405: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204640.76410: variable 'omit' from source: magic vars 46400 1727204640.76450: variable 'omit' from source: magic vars 46400 1727204640.76478: variable 'omit' from source: magic vars 46400 1727204640.76514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204640.76542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204640.76562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204640.76581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.76590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204640.76614: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204640.76617: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.76619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.76693: Set connection var ansible_shell_type to sh 46400 1727204640.76702: Set connection var ansible_shell_executable to /bin/sh 46400 1727204640.76707: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204640.76711: Set connection var ansible_connection to ssh 46400 1727204640.76716: Set connection var ansible_pipelining to False 46400 1727204640.76721: Set connection var ansible_timeout to 10 46400 1727204640.76740: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.76743: variable 'ansible_connection' from source: unknown 46400 1727204640.76745: variable 'ansible_module_compression' from source: unknown 46400 1727204640.76748: variable 'ansible_shell_type' from source: unknown 46400 1727204640.76751: variable 'ansible_shell_executable' from source: unknown 46400 1727204640.76754: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204640.76756: variable 'ansible_pipelining' from source: unknown 46400 1727204640.76758: variable 'ansible_timeout' from source: unknown 46400 1727204640.76767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204640.76873: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204640.76883: variable 'omit' from source: magic vars 
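The "Set connection var ..." entries above show the connection settings the executor resolves for managed-node2 before running the module: an ssh connection, sh shell type with /bin/sh as the executable, ZIP_DEFLATED module compression, pipelining off, and a 10-second timeout. Equivalent values could be pinned explicitly as host or group variables, for example (placement here is purely illustrative):

# Illustrative host/group vars mirroring the resolved connection settings above.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_module_compression: ZIP_DEFLATED
ansible_pipelining: false
ansible_timeout: 10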
46400 1727204640.76886: starting attempt loop 46400 1727204640.76888: running the handler 46400 1727204640.76904: _low_level_execute_command(): starting 46400 1727204640.76912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204640.77447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.77457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.77493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.77508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.77562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.77580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.77644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.79297: stdout chunk (state=3): >>>/root <<< 46400 1727204640.79395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.79449: stderr chunk (state=3): >>><<< 46400 1727204640.79456: stdout chunk (state=3): >>><<< 46400 1727204640.79486: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.79498: _low_level_execute_command(): starting 46400 1727204640.79504: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384 `" && echo 
ansible-tmp-1727204640.7948568-55538-277720273980384="` echo /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384 `" ) && sleep 0' 46400 1727204640.79972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.79985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.80013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.80027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.80078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.80090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.80139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.81992: stdout chunk (state=3): >>>ansible-tmp-1727204640.7948568-55538-277720273980384=/root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384 <<< 46400 1727204640.82101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.82166: stderr chunk (state=3): >>><<< 46400 1727204640.82170: stdout chunk (state=3): >>><<< 46400 1727204640.82187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204640.7948568-55538-277720273980384=/root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.82214: variable 'ansible_module_compression' from source: unknown 46400 1727204640.82263: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204640.82298: variable 'ansible_facts' from source: unknown 46400 1727204640.82346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/AnsiballZ_command.py 46400 1727204640.82464: Sending initial data 46400 1727204640.82476: Sent initial data (156 bytes) 46400 1727204640.83176: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204640.83182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.83213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.83225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.83283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.83294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.83337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.85044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204640.85079: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204640.85115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp9d1f2_b2 /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/AnsiballZ_command.py <<< 46400 1727204640.85148: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204640.85941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.86053: stderr chunk (state=3): >>><<< 46400 1727204640.86056: stdout chunk (state=3): >>><<< 46400 1727204640.86077: done transferring module to remote 46400 1727204640.86086: _low_level_execute_command(): starting 46400 1727204640.86090: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/ 
/root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/AnsiballZ_command.py && sleep 0' 46400 1727204640.86553: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.86563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.86601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.86613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204640.86624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.86674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.86682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.86689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.86739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204640.88440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204640.88496: stderr chunk (state=3): >>><<< 46400 1727204640.88500: stdout chunk (state=3): >>><<< 46400 1727204640.88520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204640.88529: _low_level_execute_command(): starting 46400 1727204640.88532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/AnsiballZ_command.py && sleep 0' 46400 1727204640.88999: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204640.89013: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204640.89030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204640.89043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204640.89055: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204640.89104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204640.89109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204640.89123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204640.89181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.24381: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.51.0-1.el9", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-24 15:04:01.021148", "end": "2024-09-24 15:04:01.242844", "delta": "0:00:00.221696", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204641.25753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204641.25787: stderr chunk (state=3): >>><<< 46400 1727204641.25791: stdout chunk (state=3): >>><<< 46400 1727204641.25870: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.51.0-1.el9", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-24 15:04:01.021148", "end": "2024-09-24 15:04:01.242844", "delta": "0:00:00.221696", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
46400 1727204641.25875: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204641.25879: _low_level_execute_command(): starting 46400 1727204641.25881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204640.7948568-55538-277720273980384/ > /dev/null 2>&1 && sleep 0' 46400 1727204641.26542: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204641.26558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.26578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.26598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.26644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.26658: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204641.26678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.26698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204641.26710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204641.26721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204641.26733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.26747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.26765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.26781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.26794: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204641.26807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.26887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.26905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.26920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.27005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.28841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204641.28945: stderr chunk (state=3): >>><<< 46400 1727204641.28954: stdout chunk (state=3): >>><<< 46400 1727204641.28991: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204641.28997: handler run complete 46400 1727204641.29026: Evaluated conditional (False): False 46400 1727204641.29036: attempt loop complete, returning result 46400 1727204641.29039: _execute() done 46400 1727204641.29042: dumping result to json 46400 1727204641.29048: done dumping result, returning 46400 1727204641.29056: done running TaskExecutor() for managed-node2/TASK: Get NetworkManager RPM version [0affcd87-79f5-1303-fda8-000000002809] 46400 1727204641.29068: sending task result for task 0affcd87-79f5-1303-fda8-000000002809 ok: [managed-node2] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.221696", "end": "2024-09-24 15:04:01.242844", "rc": 0, "start": "2024-09-24 15:04:01.021148" } STDOUT: NetworkManager-1.51.0-1.el9 46400 1727204641.29276: no more pending results, returning what we have 46400 1727204641.29281: results queue empty 46400 1727204641.29284: checking for any_errors_fatal 46400 1727204641.29292: done checking for any_errors_fatal 46400 1727204641.29293: checking for max_fail_percentage 46400 1727204641.29296: done checking for max_fail_percentage 46400 1727204641.29297: checking to see if all hosts have failed and the running result is not ok 46400 1727204641.29298: done checking to see if all hosts have failed 46400 1727204641.29298: getting the remaining hosts for this loop 46400 1727204641.29300: done getting the remaining hosts for this loop 46400 1727204641.29304: getting the next task for host managed-node2 46400 1727204641.29315: done getting next task for host managed-node2 46400 1727204641.29318: ^ task is: TASK: Store NetworkManager version 46400 1727204641.29322: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204641.29327: getting variables 46400 1727204641.29329: in VariableManager get_vars() 46400 1727204641.29395: Calling all_inventory to load vars for managed-node2 46400 1727204641.29398: Calling groups_inventory to load vars for managed-node2 46400 1727204641.29402: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.29416: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.29419: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.29423: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.29948: done sending task result for task 0affcd87-79f5-1303-fda8-000000002809 46400 1727204641.29953: WORKER PROCESS EXITING 46400 1727204641.31391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.33296: done with get_vars() 46400 1727204641.33334: done getting variables 46400 1727204641.33424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.580) 0:02:11.619 ***** 46400 1727204641.33463: entering _queue_task() for managed-node2/set_fact 46400 1727204641.33876: worker is 1 (out of 1 available) 46400 1727204641.33889: exiting _queue_task() for managed-node2/set_fact 46400 1727204641.33902: done queuing things up, now waiting for results queue to drain 46400 1727204641.33904: waiting for pending results... 
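The set_fact task queued here (get_NetworkManager_NVR.yml:14) and the debug task that follows it (line 18 of the same file) take the registered rpm output and publish it as the fact networkmanager_nvr, reported below as "NetworkManager-1.51.0-1.el9". A minimal sketch of the two tasks, assuming the fact is taken from the registered command's stdout:

- name: Store NetworkManager version
  set_fact:
    # The exact expression is an assumption; the fact name and value match the result in the trace.
    networkmanager_nvr: "{{ __rpm_q_networkmanager.stdout }}"

- name: Show NetworkManager version
  debug:
    var: networkmanager_nvr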
46400 1727204641.34240: running TaskExecutor() for managed-node2/TASK: Store NetworkManager version 46400 1727204641.34377: in run() - task 0affcd87-79f5-1303-fda8-00000000280a 46400 1727204641.34396: variable 'ansible_search_path' from source: unknown 46400 1727204641.34403: variable 'ansible_search_path' from source: unknown 46400 1727204641.34443: calling self._execute() 46400 1727204641.34559: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.34571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.34577: variable 'omit' from source: magic vars 46400 1727204641.35017: variable 'ansible_distribution_major_version' from source: facts 46400 1727204641.35035: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204641.35051: variable 'omit' from source: magic vars 46400 1727204641.35108: variable 'omit' from source: magic vars 46400 1727204641.35226: variable '__rpm_q_networkmanager' from source: set_fact 46400 1727204641.35253: variable 'omit' from source: magic vars 46400 1727204641.35306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204641.35340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204641.35372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204641.35393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.35403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.35433: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204641.35437: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.35440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.35544: Set connection var ansible_shell_type to sh 46400 1727204641.35554: Set connection var ansible_shell_executable to /bin/sh 46400 1727204641.35559: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204641.35572: Set connection var ansible_connection to ssh 46400 1727204641.35583: Set connection var ansible_pipelining to False 46400 1727204641.35595: Set connection var ansible_timeout to 10 46400 1727204641.35622: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.35625: variable 'ansible_connection' from source: unknown 46400 1727204641.35628: variable 'ansible_module_compression' from source: unknown 46400 1727204641.35630: variable 'ansible_shell_type' from source: unknown 46400 1727204641.35632: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.35635: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.35637: variable 'ansible_pipelining' from source: unknown 46400 1727204641.35640: variable 'ansible_timeout' from source: unknown 46400 1727204641.35645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.35816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204641.35826: variable 'omit' from source: magic vars 46400 1727204641.35831: starting attempt loop 46400 1727204641.35834: running the handler 46400 1727204641.35845: handler run complete 46400 1727204641.35855: attempt loop complete, returning result 46400 1727204641.35858: _execute() done 46400 1727204641.35861: dumping result to json 46400 1727204641.35867: done dumping result, returning 46400 1727204641.35874: done running TaskExecutor() for managed-node2/TASK: Store NetworkManager version [0affcd87-79f5-1303-fda8-00000000280a] 46400 1727204641.35880: sending task result for task 0affcd87-79f5-1303-fda8-00000000280a 46400 1727204641.35979: done sending task result for task 0affcd87-79f5-1303-fda8-00000000280a 46400 1727204641.35983: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.51.0-1.el9" }, "changed": false } 46400 1727204641.36062: no more pending results, returning what we have 46400 1727204641.36069: results queue empty 46400 1727204641.36070: checking for any_errors_fatal 46400 1727204641.36083: done checking for any_errors_fatal 46400 1727204641.36084: checking for max_fail_percentage 46400 1727204641.36086: done checking for max_fail_percentage 46400 1727204641.36087: checking to see if all hosts have failed and the running result is not ok 46400 1727204641.36088: done checking to see if all hosts have failed 46400 1727204641.36088: getting the remaining hosts for this loop 46400 1727204641.36090: done getting the remaining hosts for this loop 46400 1727204641.36095: getting the next task for host managed-node2 46400 1727204641.36104: done getting next task for host managed-node2 46400 1727204641.36106: ^ task is: TASK: Show NetworkManager version 46400 1727204641.36110: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204641.36115: getting variables 46400 1727204641.36117: in VariableManager get_vars() 46400 1727204641.36173: Calling all_inventory to load vars for managed-node2 46400 1727204641.36176: Calling groups_inventory to load vars for managed-node2 46400 1727204641.36179: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.36190: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.36193: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.36195: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.38324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.40091: done with get_vars() 46400 1727204641.40118: done getting variables 46400 1727204641.40168: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.067) 0:02:11.686 ***** 46400 1727204641.40196: entering _queue_task() for managed-node2/debug 46400 1727204641.40457: worker is 1 (out of 1 available) 46400 1727204641.40475: exiting _queue_task() for managed-node2/debug 46400 1727204641.40488: done queuing things up, now waiting for results queue to drain 46400 1727204641.40490: waiting for pending results... 46400 1727204641.40685: running TaskExecutor() for managed-node2/TASK: Show NetworkManager version 46400 1727204641.40774: in run() - task 0affcd87-79f5-1303-fda8-00000000280b 46400 1727204641.40788: variable 'ansible_search_path' from source: unknown 46400 1727204641.40792: variable 'ansible_search_path' from source: unknown 46400 1727204641.40819: calling self._execute() 46400 1727204641.40903: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.40907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.40915: variable 'omit' from source: magic vars 46400 1727204641.41197: variable 'ansible_distribution_major_version' from source: facts 46400 1727204641.41208: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204641.41216: variable 'omit' from source: magic vars 46400 1727204641.41257: variable 'omit' from source: magic vars 46400 1727204641.41284: variable 'omit' from source: magic vars 46400 1727204641.41321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204641.41349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204641.41372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204641.41384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.41393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.41416: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 46400 1727204641.41421: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.41423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.41495: Set connection var ansible_shell_type to sh 46400 1727204641.41503: Set connection var ansible_shell_executable to /bin/sh 46400 1727204641.41508: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204641.41513: Set connection var ansible_connection to ssh 46400 1727204641.41518: Set connection var ansible_pipelining to False 46400 1727204641.41523: Set connection var ansible_timeout to 10 46400 1727204641.41545: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.41548: variable 'ansible_connection' from source: unknown 46400 1727204641.41550: variable 'ansible_module_compression' from source: unknown 46400 1727204641.41553: variable 'ansible_shell_type' from source: unknown 46400 1727204641.41555: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.41557: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.41559: variable 'ansible_pipelining' from source: unknown 46400 1727204641.41570: variable 'ansible_timeout' from source: unknown 46400 1727204641.41572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.41780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204641.41784: variable 'omit' from source: magic vars 46400 1727204641.41787: starting attempt loop 46400 1727204641.41791: running the handler 46400 1727204641.42319: variable 'networkmanager_nvr' from source: set_fact 46400 1727204641.42322: variable 'networkmanager_nvr' from source: set_fact 46400 1727204641.42324: handler run complete 46400 1727204641.42326: attempt loop complete, returning result 46400 1727204641.42327: _execute() done 46400 1727204641.42329: dumping result to json 46400 1727204641.42331: done dumping result, returning 46400 1727204641.42333: done running TaskExecutor() for managed-node2/TASK: Show NetworkManager version [0affcd87-79f5-1303-fda8-00000000280b] 46400 1727204641.42335: sending task result for task 0affcd87-79f5-1303-fda8-00000000280b 46400 1727204641.42404: done sending task result for task 0affcd87-79f5-1303-fda8-00000000280b 46400 1727204641.42407: WORKER PROCESS EXITING ok: [managed-node2] => { "networkmanager_nvr": "NetworkManager-1.51.0-1.el9" } 46400 1727204641.42457: no more pending results, returning what we have 46400 1727204641.42460: results queue empty 46400 1727204641.42461: checking for any_errors_fatal 46400 1727204641.42468: done checking for any_errors_fatal 46400 1727204641.42469: checking for max_fail_percentage 46400 1727204641.42471: done checking for max_fail_percentage 46400 1727204641.42472: checking to see if all hosts have failed and the running result is not ok 46400 1727204641.42473: done checking to see if all hosts have failed 46400 1727204641.42473: getting the remaining hosts for this loop 46400 1727204641.42475: done getting the remaining hosts for this loop 46400 1727204641.42481: getting the next task for host managed-node2 46400 1727204641.42493: done getting next task for host managed-node2 46400 1727204641.42496: ^ task is: TASK: 
Conditional asserts 46400 1727204641.42499: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204641.42503: getting variables 46400 1727204641.42504: in VariableManager get_vars() 46400 1727204641.42548: Calling all_inventory to load vars for managed-node2 46400 1727204641.42551: Calling groups_inventory to load vars for managed-node2 46400 1727204641.42555: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.42566: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.42569: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.42572: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.44111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.45090: done with get_vars() 46400 1727204641.45117: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.049) 0:02:11.736 ***** 46400 1727204641.45194: entering _queue_task() for managed-node2/include_tasks 46400 1727204641.45459: worker is 1 (out of 1 available) 46400 1727204641.45477: exiting _queue_task() for managed-node2/include_tasks 46400 1727204641.45490: done queuing things up, now waiting for results queue to drain 46400 1727204641.45492: waiting for pending results... 
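The 'Conditional asserts' task queued here (run_test.yml:42) is an include_tasks driven by the lsr_assert_when include parameter; the trace below shows it evaluating item['condition'] and including tasks/assert_device_absent.yml for the item {'what': 'tasks/assert_device_absent.yml', 'condition': True}. A sketch of the likely shape, assuming the loop runs directly over lsr_assert_when:

- name: Conditional asserts
  include_tasks: "{{ item.what }}"  # 'what' names the assert task file to include
  when: item['condition']           # the conditional evaluated in the trace below
  loop: "{{ lsr_assert_when }}"     # loop source is an assumption based on the include params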
46400 1727204641.45686: running TaskExecutor() for managed-node2/TASK: Conditional asserts 46400 1727204641.45767: in run() - task 0affcd87-79f5-1303-fda8-0000000020b3 46400 1727204641.45777: variable 'ansible_search_path' from source: unknown 46400 1727204641.45781: variable 'ansible_search_path' from source: unknown 46400 1727204641.46116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 46400 1727204641.48500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 46400 1727204641.48551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 46400 1727204641.48584: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 46400 1727204641.48616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 46400 1727204641.48637: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 46400 1727204641.48702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 46400 1727204641.48726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 46400 1727204641.48744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 46400 1727204641.48774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 46400 1727204641.48785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 46400 1727204641.48868: variable 'lsr_assert_when' from source: include params 46400 1727204641.48955: variable 'network_provider' from source: set_fact 46400 1727204641.49016: variable 'omit' from source: magic vars 46400 1727204641.49090: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.49097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.49105: variable 'omit' from source: magic vars 46400 1727204641.49267: variable 'ansible_distribution_major_version' from source: facts 46400 1727204641.49273: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204641.49351: variable 'item' from source: unknown 46400 1727204641.49359: Evaluated conditional (item['condition']): True 46400 1727204641.49419: variable 'item' from source: unknown 46400 1727204641.49447: variable 'item' from source: unknown 46400 1727204641.49496: variable 'item' from source: unknown 46400 1727204641.49641: dumping result to json 46400 1727204641.49643: done dumping result, returning 46400 1727204641.49645: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [0affcd87-79f5-1303-fda8-0000000020b3] 46400 1727204641.49647: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b3 46400 
1727204641.49689: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b3 46400 1727204641.49692: WORKER PROCESS EXITING 46400 1727204641.49717: no more pending results, returning what we have 46400 1727204641.49722: in VariableManager get_vars() 46400 1727204641.49785: Calling all_inventory to load vars for managed-node2 46400 1727204641.49788: Calling groups_inventory to load vars for managed-node2 46400 1727204641.49792: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.49804: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.49807: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.49809: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.50866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.51793: done with get_vars() 46400 1727204641.51814: variable 'ansible_search_path' from source: unknown 46400 1727204641.51815: variable 'ansible_search_path' from source: unknown 46400 1727204641.51846: we have included files to process 46400 1727204641.51846: generating all_blocks data 46400 1727204641.51848: done generating all_blocks data 46400 1727204641.51852: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204641.51853: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204641.51854: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 46400 1727204641.51938: in VariableManager get_vars() 46400 1727204641.51955: done with get_vars() 46400 1727204641.52042: done processing included file 46400 1727204641.52044: iterating over new_blocks loaded from include file 46400 1727204641.52045: in VariableManager get_vars() 46400 1727204641.52058: done with get_vars() 46400 1727204641.52059: filtering new block on tags 46400 1727204641.52086: done filtering new block on tags 46400 1727204641.52087: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 46400 1727204641.52092: extending task lists for all hosts with included blocks 46400 1727204641.52934: done extending task lists 46400 1727204641.52936: done processing included files 46400 1727204641.52936: results queue empty 46400 1727204641.52937: checking for any_errors_fatal 46400 1727204641.52940: done checking for any_errors_fatal 46400 1727204641.52940: checking for max_fail_percentage 46400 1727204641.52941: done checking for max_fail_percentage 46400 1727204641.52941: checking to see if all hosts have failed and the running result is not ok 46400 1727204641.52942: done checking to see if all hosts have failed 46400 1727204641.52943: getting the remaining hosts for this loop 46400 1727204641.52944: done getting the remaining hosts for this loop 46400 1727204641.52945: getting the next task for host managed-node2 46400 1727204641.52948: done getting next task for host managed-node2 46400 1727204641.52950: ^ task is: TASK: Include the task 'get_interface_stat.yml' 46400 1727204641.52952: ^ state is: HOST STATE: block=8, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204641.52961: getting variables 46400 1727204641.52962: in VariableManager get_vars() 46400 1727204641.52977: Calling all_inventory to load vars for managed-node2 46400 1727204641.52979: Calling groups_inventory to load vars for managed-node2 46400 1727204641.52981: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.52987: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.52988: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.52990: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.53755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.59592: done with get_vars() 46400 1727204641.59616: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.144) 0:02:11.881 ***** 46400 1727204641.59681: entering _queue_task() for managed-node2/include_tasks 46400 1727204641.59944: worker is 1 (out of 1 available) 46400 1727204641.59959: exiting _queue_task() for managed-node2/include_tasks 46400 1727204641.59981: done queuing things up, now waiting for results queue to drain 46400 1727204641.59984: waiting for pending results... 
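assert_device_absent.yml:3 then includes get_interface_stat.yml, whose single task ('Get stat for interface {{ interface }}', templated here with interface=statebr from play vars) stats the interface's sysfs entry. The module arguments and the stat result ("exists": false) appear later in the trace; a sketch consistent with them, with a hypothetical register name:

- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"  # resolves to /sys/class/net/statebr in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # register name is hypothetical; the real name is not shown in this log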
46400 1727204641.60197: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 46400 1727204641.60297: in run() - task 0affcd87-79f5-1303-fda8-0000000028d3 46400 1727204641.60308: variable 'ansible_search_path' from source: unknown 46400 1727204641.60313: variable 'ansible_search_path' from source: unknown 46400 1727204641.60343: calling self._execute() 46400 1727204641.60431: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.60435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.60448: variable 'omit' from source: magic vars 46400 1727204641.60739: variable 'ansible_distribution_major_version' from source: facts 46400 1727204641.60750: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204641.60756: _execute() done 46400 1727204641.60759: dumping result to json 46400 1727204641.60765: done dumping result, returning 46400 1727204641.60772: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1303-fda8-0000000028d3] 46400 1727204641.60779: sending task result for task 0affcd87-79f5-1303-fda8-0000000028d3 46400 1727204641.60876: done sending task result for task 0affcd87-79f5-1303-fda8-0000000028d3 46400 1727204641.60880: WORKER PROCESS EXITING 46400 1727204641.61145: no more pending results, returning what we have 46400 1727204641.61151: in VariableManager get_vars() 46400 1727204641.61199: Calling all_inventory to load vars for managed-node2 46400 1727204641.61202: Calling groups_inventory to load vars for managed-node2 46400 1727204641.61206: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.61217: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.61220: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.61223: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.62503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.63451: done with get_vars() 46400 1727204641.63476: variable 'ansible_search_path' from source: unknown 46400 1727204641.63477: variable 'ansible_search_path' from source: unknown 46400 1727204641.63594: variable 'item' from source: include params 46400 1727204641.63620: we have included files to process 46400 1727204641.63621: generating all_blocks data 46400 1727204641.63623: done generating all_blocks data 46400 1727204641.63624: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204641.63625: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204641.63626: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 46400 1727204641.63768: done processing included file 46400 1727204641.63770: iterating over new_blocks loaded from include file 46400 1727204641.63771: in VariableManager get_vars() 46400 1727204641.63788: done with get_vars() 46400 1727204641.63789: filtering new block on tags 46400 1727204641.63810: done filtering new block on tags 46400 1727204641.63811: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 46400 1727204641.63815: extending task lists for all hosts with included blocks 46400 1727204641.63921: done extending task lists 46400 1727204641.63922: done processing included files 46400 1727204641.63922: results queue empty 46400 1727204641.63923: checking for any_errors_fatal 46400 1727204641.63927: done checking for any_errors_fatal 46400 1727204641.63927: checking for max_fail_percentage 46400 1727204641.63928: done checking for max_fail_percentage 46400 1727204641.63928: checking to see if all hosts have failed and the running result is not ok 46400 1727204641.63929: done checking to see if all hosts have failed 46400 1727204641.63929: getting the remaining hosts for this loop 46400 1727204641.63930: done getting the remaining hosts for this loop 46400 1727204641.63932: getting the next task for host managed-node2 46400 1727204641.63935: done getting next task for host managed-node2 46400 1727204641.63937: ^ task is: TASK: Get stat for interface {{ interface }} 46400 1727204641.63940: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204641.63941: getting variables 46400 1727204641.63942: in VariableManager get_vars() 46400 1727204641.63951: Calling all_inventory to load vars for managed-node2 46400 1727204641.63953: Calling groups_inventory to load vars for managed-node2 46400 1727204641.63954: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204641.63958: Calling all_plugins_play to load vars for managed-node2 46400 1727204641.63962: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204641.63966: Calling groups_plugins_play to load vars for managed-node2 46400 1727204641.65247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204641.66978: done with get_vars() 46400 1727204641.67013: done getting variables 46400 1727204641.67158: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.075) 0:02:11.956 ***** 46400 1727204641.67196: entering _queue_task() for managed-node2/stat 46400 1727204641.67569: worker is 1 (out of 1 available) 46400 1727204641.67583: exiting _queue_task() for managed-node2/stat 46400 1727204641.67596: done queuing things up, now waiting for results queue to drain 46400 1727204641.67598: waiting for pending results... 46400 1727204641.67920: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 46400 1727204641.68075: in run() - task 0affcd87-79f5-1303-fda8-000000002979 46400 1727204641.68089: variable 'ansible_search_path' from source: unknown 46400 1727204641.68093: variable 'ansible_search_path' from source: unknown 46400 1727204641.68128: calling self._execute() 46400 1727204641.68239: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.68243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.68257: variable 'omit' from source: magic vars 46400 1727204641.68642: variable 'ansible_distribution_major_version' from source: facts 46400 1727204641.68654: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204641.68660: variable 'omit' from source: magic vars 46400 1727204641.68727: variable 'omit' from source: magic vars 46400 1727204641.68831: variable 'interface' from source: play vars 46400 1727204641.68850: variable 'omit' from source: magic vars 46400 1727204641.68900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204641.68939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204641.68961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204641.68986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.68999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204641.69033: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204641.69037: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.69040: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 46400 1727204641.69144: Set connection var ansible_shell_type to sh 46400 1727204641.69154: Set connection var ansible_shell_executable to /bin/sh 46400 1727204641.69159: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204641.69170: Set connection var ansible_connection to ssh 46400 1727204641.69175: Set connection var ansible_pipelining to False 46400 1727204641.69181: Set connection var ansible_timeout to 10 46400 1727204641.69207: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.69210: variable 'ansible_connection' from source: unknown 46400 1727204641.69213: variable 'ansible_module_compression' from source: unknown 46400 1727204641.69215: variable 'ansible_shell_type' from source: unknown 46400 1727204641.69217: variable 'ansible_shell_executable' from source: unknown 46400 1727204641.69220: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204641.69224: variable 'ansible_pipelining' from source: unknown 46400 1727204641.69228: variable 'ansible_timeout' from source: unknown 46400 1727204641.69236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204641.69444: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 46400 1727204641.69458: variable 'omit' from source: magic vars 46400 1727204641.69467: starting attempt loop 46400 1727204641.69470: running the handler 46400 1727204641.69485: _low_level_execute_command(): starting 46400 1727204641.69492: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204641.70283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204641.70296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.70308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.70324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.70372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.70380: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204641.70390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.70404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204641.70412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204641.70419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204641.70427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.70439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.70455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.70467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.70475: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204641.70485: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.70561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.70582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.70593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.70669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.72332: stdout chunk (state=3): >>>/root <<< 46400 1727204641.72512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204641.72516: stderr chunk (state=3): >>><<< 46400 1727204641.72522: stdout chunk (state=3): >>><<< 46400 1727204641.72556: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204641.72576: _low_level_execute_command(): starting 46400 1727204641.72581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388 `" && echo ansible-tmp-1727204641.725541-55570-243888128292388="` echo /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388 `" ) && sleep 0' 46400 1727204641.73330: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204641.73342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.73353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.73373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.73413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.73422: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204641.73436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.73454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204641.73470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204641.73473: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204641.73483: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 46400 1727204641.73492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.73507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.73509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.73518: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204641.73530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.73609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.73624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.73627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.73707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.75551: stdout chunk (state=3): >>>ansible-tmp-1727204641.725541-55570-243888128292388=/root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388 <<< 46400 1727204641.75667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204641.75729: stderr chunk (state=3): >>><<< 46400 1727204641.75732: stdout chunk (state=3): >>><<< 46400 1727204641.75750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204641.725541-55570-243888128292388=/root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204641.75793: variable 'ansible_module_compression' from source: unknown 46400 1727204641.75847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 46400 1727204641.75893: variable 'ansible_facts' from source: unknown 46400 1727204641.75977: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/AnsiballZ_stat.py 46400 1727204641.76124: Sending initial data 46400 1727204641.76128: Sent initial data (152 bytes) 46400 1727204641.77050: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204641.77069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.77073: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.77084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.77123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.77129: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204641.77151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.77154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204641.77157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204641.77171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204641.77174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.77192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.77195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.77213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.77216: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204641.77218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.77296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.77314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.77320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.77396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.79100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204641.79132: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204641.79174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp6g7j1uij /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/AnsiballZ_stat.py <<< 46400 1727204641.79204: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204641.80147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204641.80419: stderr chunk (state=3): >>><<< 46400 1727204641.80423: stdout chunk (state=3): >>><<< 46400 1727204641.80425: done transferring module to remote 46400 1727204641.80427: _low_level_execute_command(): starting 46400 1727204641.80434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/ /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/AnsiballZ_stat.py && sleep 0' 46400 1727204641.80998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204641.81013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.81028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.81045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.81089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.81103: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204641.81117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.81134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204641.81144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204641.81154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204641.81166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.81179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204641.81194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.81208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204641.81219: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204641.81232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.81309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.81331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.81348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.81415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.83118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204641.83175: stderr chunk (state=3): >>><<< 46400 1727204641.83181: stdout chunk (state=3): >>><<< 46400 1727204641.83196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204641.83199: _low_level_execute_command(): starting 46400 1727204641.83204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/AnsiballZ_stat.py && sleep 0' 46400 1727204641.84093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.84097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204641.97267: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 46400 1727204641.98242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204641.98304: stderr chunk (state=3): >>><<< 46400 1727204641.98308: stdout chunk (state=3): >>><<< 46400 1727204641.98326: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204641.98348: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204641.98358: _low_level_execute_command(): starting 46400 1727204641.98368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204641.725541-55570-243888128292388/ > /dev/null 2>&1 && sleep 0' 46400 1727204641.98840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204641.98845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204641.98898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.98902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 
1727204641.98905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204641.98967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204641.98971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204641.98974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204641.99022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.00819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.00883: stderr chunk (state=3): >>><<< 46400 1727204642.00887: stdout chunk (state=3): >>><<< 46400 1727204642.00901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.00907: handler run complete 46400 1727204642.00925: attempt loop complete, returning result 46400 1727204642.00928: _execute() done 46400 1727204642.00930: dumping result to json 46400 1727204642.00932: done dumping result, returning 46400 1727204642.00940: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [0affcd87-79f5-1303-fda8-000000002979] 46400 1727204642.00945: sending task result for task 0affcd87-79f5-1303-fda8-000000002979 46400 1727204642.01051: done sending task result for task 0affcd87-79f5-1303-fda8-000000002979 46400 1727204642.01056: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 46400 1727204642.01120: no more pending results, returning what we have 46400 1727204642.01125: results queue empty 46400 1727204642.01126: checking for any_errors_fatal 46400 1727204642.01128: done checking for any_errors_fatal 46400 1727204642.01128: checking for max_fail_percentage 46400 1727204642.01130: done checking for max_fail_percentage 46400 1727204642.01131: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.01132: done checking to see if all hosts have failed 46400 1727204642.01132: getting the remaining hosts for this loop 46400 1727204642.01134: done getting the remaining hosts for this loop 46400 1727204642.01138: getting the next task for host managed-node2 46400 
1727204642.01150: done getting next task for host managed-node2 46400 1727204642.01152: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 46400 1727204642.01157: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204642.01167: getting variables 46400 1727204642.01168: in VariableManager get_vars() 46400 1727204642.01217: Calling all_inventory to load vars for managed-node2 46400 1727204642.01220: Calling groups_inventory to load vars for managed-node2 46400 1727204642.01223: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.01235: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.01237: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.01240: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.02126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.03076: done with get_vars() 46400 1727204642.03101: done getting variables 46400 1727204642.03149: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204642.03250: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.360) 0:02:12.317 ***** 46400 1727204642.03280: entering _queue_task() for managed-node2/assert 46400 1727204642.03544: worker is 1 (out of 1 available) 46400 1727204642.03562: exiting _queue_task() for managed-node2/assert 46400 1727204642.03578: done queuing things up, now waiting for results queue to drain 46400 1727204642.03580: waiting for pending results... 
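The stat result above reports that /sys/class/net/statebr does not exist, and the task queued next asserts that the interface is absent using that result. Below is a minimal, hypothetical sketch of this stat-and-assert pattern, assuming the stat output is exposed as a fact named interface_stat (the log shows that variable coming from set_fact); the actual tasks live in playbooks/tasks/assert_device_absent.yml and may differ in detail.

# Hypothetical sketch of the check exercised here; not the literal file contents.
- name: Get stat for interface statebr
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # interface is 'statebr' in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists

The assert action runs entirely on the controller, which is why the task below goes straight from "running the handler" to "handler run complete" with no _low_level_execute_command() calls.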
46400 1727204642.03777: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 46400 1727204642.03858: in run() - task 0affcd87-79f5-1303-fda8-0000000028d4 46400 1727204642.03873: variable 'ansible_search_path' from source: unknown 46400 1727204642.03877: variable 'ansible_search_path' from source: unknown 46400 1727204642.03908: calling self._execute() 46400 1727204642.03994: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.04000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.04009: variable 'omit' from source: magic vars 46400 1727204642.04294: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.04307: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.04310: variable 'omit' from source: magic vars 46400 1727204642.04349: variable 'omit' from source: magic vars 46400 1727204642.04422: variable 'interface' from source: play vars 46400 1727204642.04441: variable 'omit' from source: magic vars 46400 1727204642.04478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204642.04507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204642.04526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204642.04540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.04551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.04579: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204642.04582: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.04585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.04749: Set connection var ansible_shell_type to sh 46400 1727204642.04752: Set connection var ansible_shell_executable to /bin/sh 46400 1727204642.04755: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204642.04757: Set connection var ansible_connection to ssh 46400 1727204642.04762: Set connection var ansible_pipelining to False 46400 1727204642.04766: Set connection var ansible_timeout to 10 46400 1727204642.04769: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.04772: variable 'ansible_connection' from source: unknown 46400 1727204642.04775: variable 'ansible_module_compression' from source: unknown 46400 1727204642.04777: variable 'ansible_shell_type' from source: unknown 46400 1727204642.04779: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.04781: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.04784: variable 'ansible_pipelining' from source: unknown 46400 1727204642.04786: variable 'ansible_timeout' from source: unknown 46400 1727204642.04788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.04865: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 46400 1727204642.04877: variable 'omit' from source: magic vars 46400 1727204642.04882: starting attempt loop 46400 1727204642.04886: running the handler 46400 1727204642.05032: variable 'interface_stat' from source: set_fact 46400 1727204642.05042: Evaluated conditional (not interface_stat.stat.exists): True 46400 1727204642.05049: handler run complete 46400 1727204642.05070: attempt loop complete, returning result 46400 1727204642.05073: _execute() done 46400 1727204642.05077: dumping result to json 46400 1727204642.05080: done dumping result, returning 46400 1727204642.05083: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [0affcd87-79f5-1303-fda8-0000000028d4] 46400 1727204642.05085: sending task result for task 0affcd87-79f5-1303-fda8-0000000028d4 46400 1727204642.05183: done sending task result for task 0affcd87-79f5-1303-fda8-0000000028d4 46400 1727204642.05186: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 46400 1727204642.05234: no more pending results, returning what we have 46400 1727204642.05240: results queue empty 46400 1727204642.05241: checking for any_errors_fatal 46400 1727204642.05255: done checking for any_errors_fatal 46400 1727204642.05256: checking for max_fail_percentage 46400 1727204642.05257: done checking for max_fail_percentage 46400 1727204642.05258: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.05259: done checking to see if all hosts have failed 46400 1727204642.05262: getting the remaining hosts for this loop 46400 1727204642.05265: done getting the remaining hosts for this loop 46400 1727204642.05269: getting the next task for host managed-node2 46400 1727204642.05279: done getting next task for host managed-node2 46400 1727204642.05282: ^ task is: TASK: Success in test '{{ lsr_description }}' 46400 1727204642.05285: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204642.05290: getting variables 46400 1727204642.05291: in VariableManager get_vars() 46400 1727204642.05344: Calling all_inventory to load vars for managed-node2 46400 1727204642.05347: Calling groups_inventory to load vars for managed-node2 46400 1727204642.05350: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.05365: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.05368: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.05371: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.07114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.08614: done with get_vars() 46400 1727204642.08642: done getting variables 46400 1727204642.08694: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 46400 1727204642.08791: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.055) 0:02:12.372 ***** 46400 1727204642.08819: entering _queue_task() for managed-node2/debug 46400 1727204642.09075: worker is 1 (out of 1 available) 46400 1727204642.09091: exiting _queue_task() for managed-node2/debug 46400 1727204642.09103: done queuing things up, now waiting for results queue to drain 46400 1727204642.09105: waiting for pending results... 
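The task queued here is the success banner from run_test.yml:47; its output appears a few lines below as "MSG: +++++ Success in test ... +++++". A plausible sketch of that task, assuming it is a plain debug with a templated message (lsr_description is an include parameter in this log):

# Hypothetical sketch; the real task is at playbooks/tasks/run_test.yml:47.
- name: Success in test '{{ lsr_description }}'
  ansible.builtin.debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"

Like assert, debug is a controller-side action, so this task produces no SSH traffic in the log.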
46400 1727204642.09306: running TaskExecutor() for managed-node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 46400 1727204642.09396: in run() - task 0affcd87-79f5-1303-fda8-0000000020b4 46400 1727204642.09409: variable 'ansible_search_path' from source: unknown 46400 1727204642.09412: variable 'ansible_search_path' from source: unknown 46400 1727204642.09442: calling self._execute() 46400 1727204642.09528: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.09532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.09543: variable 'omit' from source: magic vars 46400 1727204642.09834: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.09844: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.09850: variable 'omit' from source: magic vars 46400 1727204642.09885: variable 'omit' from source: magic vars 46400 1727204642.09957: variable 'lsr_description' from source: include params 46400 1727204642.09977: variable 'omit' from source: magic vars 46400 1727204642.10017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204642.10041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204642.10059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204642.10077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.10088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.10111: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204642.10114: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.10119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.10188: Set connection var ansible_shell_type to sh 46400 1727204642.10196: Set connection var ansible_shell_executable to /bin/sh 46400 1727204642.10200: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204642.10206: Set connection var ansible_connection to ssh 46400 1727204642.10211: Set connection var ansible_pipelining to False 46400 1727204642.10216: Set connection var ansible_timeout to 10 46400 1727204642.10238: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.10241: variable 'ansible_connection' from source: unknown 46400 1727204642.10244: variable 'ansible_module_compression' from source: unknown 46400 1727204642.10246: variable 'ansible_shell_type' from source: unknown 46400 1727204642.10248: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.10250: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.10253: variable 'ansible_pipelining' from source: unknown 46400 1727204642.10256: variable 'ansible_timeout' from source: unknown 46400 1727204642.10261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.10372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.10381: variable 'omit' from source: magic vars 46400 1727204642.10386: starting attempt loop 46400 1727204642.10389: running the handler 46400 1727204642.10422: handler run complete 46400 1727204642.10434: attempt loop complete, returning result 46400 1727204642.10437: _execute() done 46400 1727204642.10440: dumping result to json 46400 1727204642.10443: done dumping result, returning 46400 1727204642.10453: done running TaskExecutor() for managed-node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [0affcd87-79f5-1303-fda8-0000000020b4] 46400 1727204642.10455: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b4 46400 1727204642.10545: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b4 46400 1727204642.10547: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 46400 1727204642.10600: no more pending results, returning what we have 46400 1727204642.10604: results queue empty 46400 1727204642.10605: checking for any_errors_fatal 46400 1727204642.10614: done checking for any_errors_fatal 46400 1727204642.10615: checking for max_fail_percentage 46400 1727204642.10616: done checking for max_fail_percentage 46400 1727204642.10617: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.10618: done checking to see if all hosts have failed 46400 1727204642.10619: getting the remaining hosts for this loop 46400 1727204642.10620: done getting the remaining hosts for this loop 46400 1727204642.10624: getting the next task for host managed-node2 46400 1727204642.10633: done getting next task for host managed-node2 46400 1727204642.10637: ^ task is: TASK: Cleanup 46400 1727204642.10640: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204642.10646: getting variables 46400 1727204642.10648: in VariableManager get_vars() 46400 1727204642.10703: Calling all_inventory to load vars for managed-node2 46400 1727204642.10706: Calling groups_inventory to load vars for managed-node2 46400 1727204642.10709: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.10720: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.10723: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.10725: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.11594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.12681: done with get_vars() 46400 1727204642.12700: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.039) 0:02:12.412 ***** 46400 1727204642.12777: entering _queue_task() for managed-node2/include_tasks 46400 1727204642.13035: worker is 1 (out of 1 available) 46400 1727204642.13048: exiting _queue_task() for managed-node2/include_tasks 46400 1727204642.13061: done queuing things up, now waiting for results queue to drain 46400 1727204642.13065: waiting for pending results... 46400 1727204642.13269: running TaskExecutor() for managed-node2/TASK: Cleanup 46400 1727204642.13352: in run() - task 0affcd87-79f5-1303-fda8-0000000020b8 46400 1727204642.13367: variable 'ansible_search_path' from source: unknown 46400 1727204642.13371: variable 'ansible_search_path' from source: unknown 46400 1727204642.13411: variable 'lsr_cleanup' from source: include params 46400 1727204642.13577: variable 'lsr_cleanup' from source: include params 46400 1727204642.13637: variable 'omit' from source: magic vars 46400 1727204642.13748: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.13755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.13766: variable 'omit' from source: magic vars 46400 1727204642.13942: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.13950: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.13956: variable 'item' from source: unknown 46400 1727204642.14008: variable 'item' from source: unknown 46400 1727204642.14034: variable 'item' from source: unknown 46400 1727204642.14087: variable 'item' from source: unknown 46400 1727204642.14211: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.14215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.14217: variable 'omit' from source: magic vars 46400 1727204642.14302: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.14306: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.14313: variable 'item' from source: unknown 46400 1727204642.14359: variable 'item' from source: unknown 46400 1727204642.14383: variable 'item' from source: unknown 46400 1727204642.14427: variable 'item' from source: unknown 46400 1727204642.14500: dumping result to json 46400 1727204642.14503: done dumping result, returning 46400 1727204642.14505: done running TaskExecutor() for managed-node2/TASK: Cleanup 
[0affcd87-79f5-1303-fda8-0000000020b8] 46400 1727204642.14507: sending task result for task 0affcd87-79f5-1303-fda8-0000000020b8 46400 1727204642.14539: done sending task result for task 0affcd87-79f5-1303-fda8-0000000020b8 46400 1727204642.14541: WORKER PROCESS EXITING 46400 1727204642.14575: no more pending results, returning what we have 46400 1727204642.14580: in VariableManager get_vars() 46400 1727204642.14633: Calling all_inventory to load vars for managed-node2 46400 1727204642.14636: Calling groups_inventory to load vars for managed-node2 46400 1727204642.14640: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.14664: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.14670: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.14674: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.15559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.16497: done with get_vars() 46400 1727204642.16520: variable 'ansible_search_path' from source: unknown 46400 1727204642.16521: variable 'ansible_search_path' from source: unknown 46400 1727204642.16551: variable 'ansible_search_path' from source: unknown 46400 1727204642.16552: variable 'ansible_search_path' from source: unknown 46400 1727204642.16573: we have included files to process 46400 1727204642.16574: generating all_blocks data 46400 1727204642.16575: done generating all_blocks data 46400 1727204642.16578: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204642.16579: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204642.16580: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 46400 1727204642.16724: done processing included file 46400 1727204642.16725: iterating over new_blocks loaded from include file 46400 1727204642.16726: in VariableManager get_vars() 46400 1727204642.16741: done with get_vars() 46400 1727204642.16742: filtering new block on tags 46400 1727204642.16760: done filtering new block on tags 46400 1727204642.16762: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 46400 1727204642.16767: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 46400 1727204642.16768: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 46400 1727204642.16770: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 46400 1727204642.17004: done processing included file 46400 1727204642.17005: iterating over new_blocks loaded from include file 46400 1727204642.17006: in VariableManager get_vars() 46400 1727204642.17017: done with get_vars() 46400 1727204642.17018: filtering new block on tags 46400 1727204642.17038: done filtering new block on tags 46400 1727204642.17039: done iterating over new_blocks loaded 
from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 => (item=tasks/check_network_dns.yml) 46400 1727204642.17042: extending task lists for all hosts with included blocks 46400 1727204642.18021: done extending task lists 46400 1727204642.18023: done processing included files 46400 1727204642.18023: results queue empty 46400 1727204642.18024: checking for any_errors_fatal 46400 1727204642.18028: done checking for any_errors_fatal 46400 1727204642.18029: checking for max_fail_percentage 46400 1727204642.18030: done checking for max_fail_percentage 46400 1727204642.18030: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.18031: done checking to see if all hosts have failed 46400 1727204642.18031: getting the remaining hosts for this loop 46400 1727204642.18032: done getting the remaining hosts for this loop 46400 1727204642.18034: getting the next task for host managed-node2 46400 1727204642.18037: done getting next task for host managed-node2 46400 1727204642.18039: ^ task is: TASK: Cleanup profile and device 46400 1727204642.18042: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204642.18044: getting variables 46400 1727204642.18044: in VariableManager get_vars() 46400 1727204642.18054: Calling all_inventory to load vars for managed-node2 46400 1727204642.18062: Calling groups_inventory to load vars for managed-node2 46400 1727204642.18065: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.18070: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.18072: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.18074: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.18829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.19748: done with get_vars() 46400 1727204642.19773: done getting variables 46400 1727204642.19809: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.070) 0:02:12.482 ***** 46400 1727204642.19833: entering _queue_task() for managed-node2/shell 46400 1727204642.20101: worker is 1 (out of 1 available) 46400 1727204642.20115: exiting _queue_task() for managed-node2/shell 46400 1727204642.20130: done queuing things up, now waiting for results queue to drain 46400 1727204642.20132: waiting for pending results... 46400 1727204642.20332: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 46400 1727204642.20414: in run() - task 0affcd87-79f5-1303-fda8-00000000299e 46400 1727204642.20425: variable 'ansible_search_path' from source: unknown 46400 1727204642.20430: variable 'ansible_search_path' from source: unknown 46400 1727204642.20458: calling self._execute() 46400 1727204642.20544: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.20548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.20556: variable 'omit' from source: magic vars 46400 1727204642.20850: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.20861: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.20870: variable 'omit' from source: magic vars 46400 1727204642.20905: variable 'omit' from source: magic vars 46400 1727204642.21018: variable 'interface' from source: play vars 46400 1727204642.21034: variable 'omit' from source: magic vars 46400 1727204642.21075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204642.21102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204642.21122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204642.21135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.21144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 
1727204642.21175: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204642.21178: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.21181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.21247: Set connection var ansible_shell_type to sh 46400 1727204642.21255: Set connection var ansible_shell_executable to /bin/sh 46400 1727204642.21260: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204642.21270: Set connection var ansible_connection to ssh 46400 1727204642.21274: Set connection var ansible_pipelining to False 46400 1727204642.21280: Set connection var ansible_timeout to 10 46400 1727204642.21300: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.21303: variable 'ansible_connection' from source: unknown 46400 1727204642.21305: variable 'ansible_module_compression' from source: unknown 46400 1727204642.21308: variable 'ansible_shell_type' from source: unknown 46400 1727204642.21310: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.21312: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.21314: variable 'ansible_pipelining' from source: unknown 46400 1727204642.21318: variable 'ansible_timeout' from source: unknown 46400 1727204642.21322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.21429: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.21440: variable 'omit' from source: magic vars 46400 1727204642.21443: starting attempt loop 46400 1727204642.21446: running the handler 46400 1727204642.21456: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.21477: _low_level_execute_command(): starting 46400 1727204642.21485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204642.22018: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.22028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.22068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.22079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.22085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204642.22097: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.22152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.22160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.22177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.22225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.23886: stdout chunk (state=3): >>>/root <<< 46400 1727204642.23986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.24050: stderr chunk (state=3): >>><<< 46400 1727204642.24053: stdout chunk (state=3): >>><<< 46400 1727204642.24080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.24093: _low_level_execute_command(): starting 46400 1727204642.24099: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459 `" && echo ansible-tmp-1727204642.2407978-55595-90969551231459="` echo /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459 `" ) && sleep 0' 46400 1727204642.24582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.24605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.24627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.24641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.24683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.24696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.24754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.26608: stdout chunk (state=3): >>>ansible-tmp-1727204642.2407978-55595-90969551231459=/root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459 <<< 46400 1727204642.26720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.26782: stderr chunk (state=3): >>><<< 46400 1727204642.26786: stdout chunk (state=3): >>><<< 46400 1727204642.26803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204642.2407978-55595-90969551231459=/root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.26833: variable 'ansible_module_compression' from source: unknown 46400 1727204642.26885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204642.26918: variable 'ansible_facts' from source: unknown 46400 1727204642.26986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/AnsiballZ_command.py 46400 1727204642.27099: Sending initial data 46400 1727204642.27103: Sent initial data (155 bytes) 46400 1727204642.27809: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.27815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.27851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204642.27857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 46400 1727204642.27871: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.27884: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.27890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.27941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.27953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.28008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.29743: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204642.29780: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204642.29820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmprryu2gq7 /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/AnsiballZ_command.py <<< 46400 1727204642.29854: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204642.30631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.30748: stderr chunk (state=3): >>><<< 46400 1727204642.30751: stdout chunk (state=3): >>><<< 46400 1727204642.30776: done transferring module to remote 46400 1727204642.30786: _low_level_execute_command(): starting 46400 1727204642.30791: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/ /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/AnsiballZ_command.py && sleep 0' 46400 1727204642.31259: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.31270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.31301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.31316: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.31327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.31383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.31394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.31443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.33163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.33224: stderr chunk (state=3): >>><<< 46400 1727204642.33228: stdout chunk (state=3): >>><<< 46400 1727204642.33250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.33254: _low_level_execute_command(): starting 46400 1727204642.33256: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/AnsiballZ_command.py && sleep 0' 46400 1727204642.33741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.33746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.33785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.33799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204642.33810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.33857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.33873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 46400 1727204642.33929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.50542: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:02.469555", "end": "2024-09-24 15:04:02.504338", "delta": "0:00:00.034783", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204642.51842: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. <<< 46400 1727204642.51846: stdout chunk (state=3): >>><<< 46400 1727204642.51848: stderr chunk (state=3): >>><<< 46400 1727204642.52001: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:02.469555", "end": "2024-09-24 15:04:02.504338", "delta": "0:00:00.034783", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.78 closed. 
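The shell invocation above is the cleanup step from cleanup_profile+device.yml:3: it tries to delete the 'statebr' connection profile and device, and the commands fail with rc=1 because the profile and device are already gone, which is exactly what this test scenario expects. A hedged reconstruction of that task, built from the _raw_params shown in the result; the failure-handling directive is an assumption, since the log shows the play continuing despite the non-zero return code.

# Hypothetical reconstruction of the cleanup task invoked above.
# failed_when: false is assumed; the real task may use ignore_errors or a rescue block instead.
# 'statebr' is the already-templated value of {{ interface }} in this run.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  failed_when: false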
46400 1727204642.52005: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204642.52008: _low_level_execute_command(): starting 46400 1727204642.52011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204642.2407978-55595-90969551231459/ > /dev/null 2>&1 && sleep 0' 46400 1727204642.52602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.52617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.52631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.52652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.52702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.52714: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.52727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.52744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.52754: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.52769: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.52781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.52794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.52808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.52819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.52829: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.52843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.52919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.52937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.52954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.53034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.54887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.54942: stderr chunk (state=3): >>><<< 
46400 1727204642.54945: stdout chunk (state=3): >>><<< 46400 1727204642.54967: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.54977: handler run complete 46400 1727204642.55005: Evaluated conditional (False): False 46400 1727204642.55015: attempt loop complete, returning result 46400 1727204642.55018: _execute() done 46400 1727204642.55021: dumping result to json 46400 1727204642.55027: done dumping result, returning 46400 1727204642.55037: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [0affcd87-79f5-1303-fda8-00000000299e] 46400 1727204642.55043: sending task result for task 0affcd87-79f5-1303-fda8-00000000299e 46400 1727204642.55153: done sending task result for task 0affcd87-79f5-1303-fda8-00000000299e 46400 1727204642.55156: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.034783", "end": "2024-09-24 15:04:02.504338", "rc": 1, "start": "2024-09-24 15:04:02.469555" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. Cannot find device "statebr" MSG: non-zero return code ...ignoring 46400 1727204642.55235: no more pending results, returning what we have 46400 1727204642.55242: results queue empty 46400 1727204642.55244: checking for any_errors_fatal 46400 1727204642.55246: done checking for any_errors_fatal 46400 1727204642.55247: checking for max_fail_percentage 46400 1727204642.55248: done checking for max_fail_percentage 46400 1727204642.55250: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.55251: done checking to see if all hosts have failed 46400 1727204642.55251: getting the remaining hosts for this loop 46400 1727204642.55253: done getting the remaining hosts for this loop 46400 1727204642.55258: getting the next task for host managed-node2 46400 1727204642.55276: done getting next task for host managed-node2 46400 1727204642.55280: ^ task is: TASK: Check routes and DNS 46400 1727204642.55285: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204642.55291: getting variables 46400 1727204642.55293: in VariableManager get_vars() 46400 1727204642.55347: Calling all_inventory to load vars for managed-node2 46400 1727204642.55350: Calling groups_inventory to load vars for managed-node2 46400 1727204642.55354: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.55372: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.55375: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.55379: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.57193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.59045: done with get_vars() 46400 1727204642.59085: done getting variables 46400 1727204642.59154: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.393) 0:02:12.876 ***** 46400 1727204642.59194: entering _queue_task() for managed-node2/shell 46400 1727204642.59589: worker is 1 (out of 1 available) 46400 1727204642.59601: exiting _queue_task() for managed-node2/shell 46400 1727204642.59615: done queuing things up, now waiting for results queue to drain 46400 1727204642.59617: waiting for pending results... 
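The recurring "debug1: auto-mux: Trying existing master", "debug2: mux_client_hello_exchange", and "mux_client_request_session" lines in the stderr dumps throughout this log are OpenSSH connection multiplexing: Ansible's ssh connection plugin keeps one master connection per host alive (ControlMaster/ControlPersist), so each _low_level_execute_command() only opens a new mux session instead of a full SSH handshake. A minimal sketch of stating that behaviour explicitly in inventory variables (illustrative values only; this run relies on the plugin defaults plus the ansible_ssh_extra_args set in the inventory):

all:
  vars:
    # Roughly the multiplexing options the ssh connection plugin applies by
    # default; the ControlPath value here is an illustrative assumption, not
    # taken from this run.
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%h-%p-%r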
46400 1727204642.59946: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 46400 1727204642.60074: in run() - task 0affcd87-79f5-1303-fda8-0000000029a2 46400 1727204642.60090: variable 'ansible_search_path' from source: unknown 46400 1727204642.60094: variable 'ansible_search_path' from source: unknown 46400 1727204642.60135: calling self._execute() 46400 1727204642.60241: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.60245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.60257: variable 'omit' from source: magic vars 46400 1727204642.60679: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.60692: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.60698: variable 'omit' from source: magic vars 46400 1727204642.60753: variable 'omit' from source: magic vars 46400 1727204642.60798: variable 'omit' from source: magic vars 46400 1727204642.60846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204642.60886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204642.60912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204642.60933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.60945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.60982: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204642.60986: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.60989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.61095: Set connection var ansible_shell_type to sh 46400 1727204642.61110: Set connection var ansible_shell_executable to /bin/sh 46400 1727204642.61115: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204642.61120: Set connection var ansible_connection to ssh 46400 1727204642.61126: Set connection var ansible_pipelining to False 46400 1727204642.61132: Set connection var ansible_timeout to 10 46400 1727204642.61161: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.61169: variable 'ansible_connection' from source: unknown 46400 1727204642.61173: variable 'ansible_module_compression' from source: unknown 46400 1727204642.61175: variable 'ansible_shell_type' from source: unknown 46400 1727204642.61178: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.61180: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.61185: variable 'ansible_pipelining' from source: unknown 46400 1727204642.61187: variable 'ansible_timeout' from source: unknown 46400 1727204642.61191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.61346: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.61358: variable 'omit' from source: magic vars 46400 
1727204642.61374: starting attempt loop 46400 1727204642.61377: running the handler 46400 1727204642.61389: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.61410: _low_level_execute_command(): starting 46400 1727204642.61418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204642.62243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.62258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.62280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.62296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.62341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.62347: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.62357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.62377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.62391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.62399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.62407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.62422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.62435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.62443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.62451: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.62460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.62545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.62567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.62581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.62651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.64215: stdout chunk (state=3): >>>/root <<< 46400 1727204642.64429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.64433: stdout chunk (state=3): >>><<< 46400 1727204642.64435: stderr chunk (state=3): >>><<< 46400 1727204642.64574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.64578: _low_level_execute_command(): starting 46400 1727204642.64590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233 `" && echo ansible-tmp-1727204642.6446521-55606-253945549011233="` echo /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233 `" ) && sleep 0' 46400 1727204642.65232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.65250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.65269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.65288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.65335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.65353: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.65370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.65390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.65402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.65414: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.65426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.65440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.65463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.65480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.65493: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.65507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.65591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.65615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.65632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.65716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.67557: stdout chunk (state=3): >>>ansible-tmp-1727204642.6446521-55606-253945549011233=/root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233 <<< 46400 1727204642.67671: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.67739: stderr chunk (state=3): >>><<< 46400 1727204642.67742: stdout chunk (state=3): >>><<< 46400 1727204642.67761: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204642.6446521-55606-253945549011233=/root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.67793: variable 'ansible_module_compression' from source: unknown 46400 1727204642.67838: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204642.67888: variable 'ansible_facts' from source: unknown 46400 1727204642.67983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/AnsiballZ_command.py 46400 1727204642.68112: Sending initial data 46400 1727204642.68115: Sent initial data (156 bytes) 46400 1727204642.69071: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.69086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.69089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.69110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.69139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.69143: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.69152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.69173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.69176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.69194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.69196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.69199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.69217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.69220: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.69222: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.69232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.69322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.69326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.69344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.69401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.71125: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204642.71157: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204642.71196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmpjvd4ep3d /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/AnsiballZ_command.py <<< 46400 1727204642.71224: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204642.72296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.72480: stderr chunk (state=3): >>><<< 46400 1727204642.72484: stdout chunk (state=3): >>><<< 46400 1727204642.72507: done transferring module to remote 46400 1727204642.72518: _low_level_execute_command(): starting 46400 1727204642.72524: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/ /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/AnsiballZ_command.py && sleep 0' 46400 1727204642.73269: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.73280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.73291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.73306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.73356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.73367: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.73382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.73396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.73404: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.73411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.73420: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.73430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.73443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.73458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.73470: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.73480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.73553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.73589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.73601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.73680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.75385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.75492: stderr chunk (state=3): >>><<< 46400 1727204642.75498: stdout chunk (state=3): >>><<< 46400 1727204642.75525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.75529: _low_level_execute_command(): starting 46400 1727204642.75535: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/AnsiballZ_command.py && sleep 0' 46400 1727204642.76258: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204642.76272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.76293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.76307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.76349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.76356: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204642.76372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 
1727204642.76388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204642.76403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204642.76410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204642.76418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204642.76428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.76441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.76449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204642.76456: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204642.76476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.76557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.76582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204642.76594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.76685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.90683: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2914sec preferred_lft 2914sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:02.896886", "end": "2024-09-24 15:04:02.905589", "delta": "0:00:00.008703", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204642.91813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204642.91876: stderr chunk (state=3): >>><<< 46400 1727204642.91880: stdout chunk (state=3): >>><<< 46400 1727204642.91898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2914sec preferred_lft 2914sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:02.896886", "end": "2024-09-24 15:04:02.905589", "delta": "0:00:00.008703", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
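The module result above is the "Check routes and DNS" diagnostic step from check_network_dns.yml:6. Reconstructed as a task, using the script body exactly as it appears in the module_args (the surrounding YAML, such as the module FQCN, is an assumption about how the task is written):

- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :
    fi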
46400 1727204642.91933: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204642.91941: _low_level_execute_command(): starting 46400 1727204642.91946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204642.6446521-55606-253945549011233/ > /dev/null 2>&1 && sleep 0' 46400 1727204642.92423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.92427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.92462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.92479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204642.92490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.92536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.92549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.92598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204642.94376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204642.94437: stderr chunk (state=3): >>><<< 46400 1727204642.94443: stdout chunk (state=3): >>><<< 46400 1727204642.94458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204642.94469: handler run complete 46400 1727204642.94488: Evaluated conditional (False): False 46400 1727204642.94496: attempt loop complete, returning result 46400 1727204642.94499: _execute() done 46400 1727204642.94501: dumping result to json 46400 1727204642.94507: done dumping result, returning 46400 1727204642.94514: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [0affcd87-79f5-1303-fda8-0000000029a2] 46400 1727204642.94521: sending task result for task 0affcd87-79f5-1303-fda8-0000000029a2 46400 1727204642.94626: done sending task result for task 0affcd87-79f5-1303-fda8-0000000029a2 46400 1727204642.94629: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008703", "end": "2024-09-24 15:04:02.905589", "rc": 0, "start": "2024-09-24 15:04:02.896886" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2914sec preferred_lft 2914sec inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 46400 1727204642.94718: no more pending results, returning what we have 46400 1727204642.94722: results queue empty 46400 1727204642.94723: checking for any_errors_fatal 46400 1727204642.94734: done checking for any_errors_fatal 46400 1727204642.94735: checking for max_fail_percentage 46400 1727204642.94736: done checking for max_fail_percentage 46400 1727204642.94737: checking to see if all hosts have failed and the running result is not ok 46400 1727204642.94740: done checking to see if all hosts have failed 46400 1727204642.94740: getting the remaining hosts for this loop 46400 1727204642.94742: done getting the remaining hosts for this loop 46400 1727204642.94746: getting the next task for host managed-node2 46400 1727204642.94754: done getting next task for host managed-node2 46400 
1727204642.94757: ^ task is: TASK: Verify DNS and network connectivity 46400 1727204642.94761: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204642.94767: getting variables 46400 1727204642.94768: in VariableManager get_vars() 46400 1727204642.94811: Calling all_inventory to load vars for managed-node2 46400 1727204642.94814: Calling groups_inventory to load vars for managed-node2 46400 1727204642.94817: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204642.94829: Calling all_plugins_play to load vars for managed-node2 46400 1727204642.94831: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204642.94834: Calling groups_plugins_play to load vars for managed-node2 46400 1727204642.95845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204642.96772: done with get_vars() 46400 1727204642.96795: done getting variables 46400 1727204642.96842: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.376) 0:02:13.253 ***** 46400 1727204642.96871: entering _queue_task() for managed-node2/shell 46400 1727204642.97133: worker is 1 (out of 1 available) 46400 1727204642.97147: exiting _queue_task() for managed-node2/shell 46400 1727204642.97161: done queuing things up, now waiting for results queue to drain 46400 1727204642.97162: waiting for pending results... 
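The next task, "Verify DNS and network connectivity" (check_network_dns.yml:24), is gated by two conditions that show up below as "Evaluated conditional (...): True" lines. In task form that gating would look roughly like the sketch here; the script body is not visible in this portion of the log, so it is left as a placeholder.

- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    # body omitted -- not visible in this portion of the log
    :
  when:
    - ansible_distribution_major_version != '6'
    - ansible_facts["distribution"] == "CentOS"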
46400 1727204642.97367: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 46400 1727204642.97458: in run() - task 0affcd87-79f5-1303-fda8-0000000029a3 46400 1727204642.97473: variable 'ansible_search_path' from source: unknown 46400 1727204642.97476: variable 'ansible_search_path' from source: unknown 46400 1727204642.97505: calling self._execute() 46400 1727204642.97587: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.97591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.97599: variable 'omit' from source: magic vars 46400 1727204642.97890: variable 'ansible_distribution_major_version' from source: facts 46400 1727204642.97900: Evaluated conditional (ansible_distribution_major_version != '6'): True 46400 1727204642.98006: variable 'ansible_facts' from source: unknown 46400 1727204642.98527: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 46400 1727204642.98531: variable 'omit' from source: magic vars 46400 1727204642.98568: variable 'omit' from source: magic vars 46400 1727204642.98596: variable 'omit' from source: magic vars 46400 1727204642.98635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 46400 1727204642.98662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 46400 1727204642.98685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 46400 1727204642.98700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.98716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 46400 1727204642.98735: variable 'inventory_hostname' from source: host vars for 'managed-node2' 46400 1727204642.98739: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.98743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.98814: Set connection var ansible_shell_type to sh 46400 1727204642.98824: Set connection var ansible_shell_executable to /bin/sh 46400 1727204642.98830: Set connection var ansible_module_compression to ZIP_DEFLATED 46400 1727204642.98835: Set connection var ansible_connection to ssh 46400 1727204642.98840: Set connection var ansible_pipelining to False 46400 1727204642.98847: Set connection var ansible_timeout to 10 46400 1727204642.98871: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.98875: variable 'ansible_connection' from source: unknown 46400 1727204642.98878: variable 'ansible_module_compression' from source: unknown 46400 1727204642.98880: variable 'ansible_shell_type' from source: unknown 46400 1727204642.98883: variable 'ansible_shell_executable' from source: unknown 46400 1727204642.98886: variable 'ansible_host' from source: host vars for 'managed-node2' 46400 1727204642.98888: variable 'ansible_pipelining' from source: unknown 46400 1727204642.98890: variable 'ansible_timeout' from source: unknown 46400 1727204642.98893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 46400 1727204642.99004: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.99014: variable 'omit' from source: magic vars 46400 1727204642.99018: starting attempt loop 46400 1727204642.99022: running the handler 46400 1727204642.99031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 46400 1727204642.99050: _low_level_execute_command(): starting 46400 1727204642.99056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 46400 1727204642.99598: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204642.99609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204642.99640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.99658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204642.99713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204642.99726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204642.99781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.01367: stdout chunk (state=3): >>>/root <<< 46400 1727204643.01463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204643.01528: stderr chunk (state=3): >>><<< 46400 1727204643.01532: stdout chunk (state=3): >>><<< 46400 1727204643.01555: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204643.01571: _low_level_execute_command(): starting 46400 1727204643.01578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248 `" && echo ansible-tmp-1727204643.0155454-55621-277589687815248="` echo /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248 `" ) && sleep 0' 46400 1727204643.02063: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.02080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.02105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.02117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.02170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204643.02186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204643.02197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204643.02243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.04094: stdout chunk (state=3): >>>ansible-tmp-1727204643.0155454-55621-277589687815248=/root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248 <<< 46400 1727204643.04305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204643.04309: stdout chunk (state=3): >>><<< 46400 1727204643.04312: stderr chunk (state=3): >>><<< 46400 1727204643.04479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204643.0155454-55621-277589687815248=/root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204643.04482: variable 'ansible_module_compression' from source: unknown 46400 1727204643.04485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-46400rspozge8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 46400 1727204643.04487: variable 'ansible_facts' from source: unknown 46400 1727204643.04593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/AnsiballZ_command.py 46400 1727204643.04784: Sending initial data 46400 1727204643.04787: Sent initial data (156 bytes) 46400 1727204643.05673: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.05679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.05710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.05714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 46400 1727204643.05717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.05769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204643.05773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204643.05783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204643.05825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.07541: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 46400 1727204643.07570: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 46400 1727204643.07626: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-46400rspozge8/tmp9xsc5k4v 
/root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/AnsiballZ_command.py <<< 46400 1727204643.07651: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 46400 1727204643.09049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204643.09053: stderr chunk (state=3): >>><<< 46400 1727204643.09065: stdout chunk (state=3): >>><<< 46400 1727204643.09085: done transferring module to remote 46400 1727204643.09096: _low_level_execute_command(): starting 46400 1727204643.09101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/ /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/AnsiballZ_command.py && sleep 0' 46400 1727204643.10435: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204643.11085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.11097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.11112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.11151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204643.11158: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204643.11171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.11185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204643.11194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204643.11201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204643.11209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.11217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.11229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.11237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204643.11247: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204643.11254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.11329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204643.11345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204643.11348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204643.11419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.13204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204643.13209: stdout chunk (state=3): >>><<< 46400 1727204643.13215: stderr chunk (state=3): >>><<< 46400 1727204643.13233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204643.13237: _low_level_execute_command(): starting 46400 1727204643.13243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/AnsiballZ_command.py && sleep 0' 46400 1727204643.13870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 46400 1727204643.13874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.13886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.13902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.13940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204643.13948: stderr chunk (state=3): >>>debug2: match not found <<< 46400 1727204643.13958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.13976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 46400 1727204643.13984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 46400 1727204643.13990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 46400 1727204643.13998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.14007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.14019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.14027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 46400 1727204643.14033: stderr chunk (state=3): >>>debug2: match found <<< 46400 1727204643.14044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.14113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204643.14138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204643.14142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204643.14218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.49301: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3910 0 --:--:-- --:--:-- --:--:-- 3910\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2487", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:04:03.271304", "end": "2024-09-24 15:04:03.491835", "delta": "0:00:00.220531", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 46400 1727204643.50646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 46400 1727204643.50706: stderr chunk (state=3): >>><<< 46400 1727204643.50710: stdout chunk (state=3): >>><<< 46400 1727204643.50729: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3910 0 --:--:-- --:--:-- --:--:-- 3910\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2487", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:04:03.271304", "end": "2024-09-24 15:04:03.491835", "delta": "0:00:00.220531", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 46400 1727204643.50763: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 46400 1727204643.50776: _low_level_execute_command(): starting 46400 1727204643.50780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204643.0155454-55621-277589687815248/ > /dev/null 2>&1 && sleep 0' 46400 1727204643.51263: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 46400 1727204643.51267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 46400 1727204643.51304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 46400 1727204643.51307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 46400 1727204643.51310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 46400 1727204643.51368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 46400 1727204643.51372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 46400 1727204643.51374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 46400 1727204643.51423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 46400 1727204643.53280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 46400 1727204643.53357: stderr chunk (state=3): >>><<< 46400 1727204643.53365: stdout chunk (state=3): >>><<< 46400 1727204643.53382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 46400 1727204643.53389: handler run complete 46400 1727204643.53416: Evaluated conditional (False): False 46400 1727204643.53427: attempt loop complete, returning result 46400 1727204643.53430: _execute() done 46400 1727204643.53433: dumping result to json 46400 1727204643.53439: done dumping result, returning 46400 1727204643.53450: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [0affcd87-79f5-1303-fda8-0000000029a3] 46400 1727204643.53452: sending task result for task 0affcd87-79f5-1303-fda8-0000000029a3 46400 1727204643.53574: done sending task result for task 0affcd87-79f5-1303-fda8-0000000029a3 46400 1727204643.53576: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.220531", "end": "2024-09-24 15:04:03.491835", "rc": 0, "start": "2024-09-24 15:04:03.271304" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3910 0 --:--:-- --:--:-- --:--:-- 3910 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2487 46400 1727204643.53642: no more pending results, returning what we have 46400 1727204643.53646: results queue empty 46400 1727204643.53647: checking for any_errors_fatal 46400 1727204643.53656: done checking for any_errors_fatal 
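Editor's note: the result shown above belongs to the "Verify DNS and network connectivity" task, whose shell script is passed verbatim as _raw_params to the command module. For convenience, the same check is reproduced below as a standalone script; this is a sketch copied from the logged module arguments, not the original task file, and the shebang and comments are added here. Note that 'set -o pipefail' relies on /bin/sh being a bash-compatible shell, as it is on the managed node in this run.

    #!/bin/sh
    # DNS and connectivity check, as executed by the task above.
    # 'pipefail' assumes a bash-compatible /bin/sh (true on this RHEL-family host).
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # Resolve the mirror name; fail the check if the lookup fails.
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # Fetch the mirror over HTTPS; fail the check if it is unreachable.
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

Run by hand on a managed node, the script exits 0 when both mirrors resolve and respond, exactly as reported in the task result above.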
46400 1727204643.53657: checking for max_fail_percentage 46400 1727204643.53659: done checking for max_fail_percentage 46400 1727204643.53662: checking to see if all hosts have failed and the running result is not ok 46400 1727204643.53663: done checking to see if all hosts have failed 46400 1727204643.53666: getting the remaining hosts for this loop 46400 1727204643.53668: done getting the remaining hosts for this loop 46400 1727204643.53672: getting the next task for host managed-node2 46400 1727204643.53683: done getting next task for host managed-node2 46400 1727204643.53685: ^ task is: TASK: meta (flush_handlers) 46400 1727204643.53688: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204643.53692: getting variables 46400 1727204643.53694: in VariableManager get_vars() 46400 1727204643.53737: Calling all_inventory to load vars for managed-node2 46400 1727204643.53740: Calling groups_inventory to load vars for managed-node2 46400 1727204643.53743: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204643.53754: Calling all_plugins_play to load vars for managed-node2 46400 1727204643.53756: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204643.53759: Calling groups_plugins_play to load vars for managed-node2 46400 1727204643.55015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204643.56081: done with get_vars() 46400 1727204643.56098: done getting variables 46400 1727204643.56152: in VariableManager get_vars() 46400 1727204643.56163: Calling all_inventory to load vars for managed-node2 46400 1727204643.56166: Calling groups_inventory to load vars for managed-node2 46400 1727204643.56168: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204643.56172: Calling all_plugins_play to load vars for managed-node2 46400 1727204643.56173: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204643.56175: Calling groups_plugins_play to load vars for managed-node2 46400 1727204643.57251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204643.58898: done with get_vars() 46400 1727204643.58932: done queuing things up, now waiting for results queue to drain 46400 1727204643.58934: results queue empty 46400 1727204643.58935: checking for any_errors_fatal 46400 1727204643.58940: done checking for any_errors_fatal 46400 1727204643.58940: checking for max_fail_percentage 46400 1727204643.58942: done checking for max_fail_percentage 46400 1727204643.58942: checking to see if all hosts have failed and the running result is not ok 46400 1727204643.58943: done checking to see if all hosts have failed 46400 1727204643.58944: getting the remaining hosts for this loop 46400 1727204643.58945: done getting the remaining hosts for this loop 46400 1727204643.58948: getting the next task for host managed-node2 46400 1727204643.58952: done getting next task for host managed-node2 46400 1727204643.58953: ^ task is: TASK: meta (flush_handlers) 46400 1727204643.58955: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 46400 1727204643.58957: getting variables 46400 1727204643.58958: in VariableManager get_vars() 46400 1727204643.58973: Calling all_inventory to load vars for managed-node2 46400 1727204643.58975: Calling groups_inventory to load vars for managed-node2 46400 1727204643.58978: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204643.58983: Calling all_plugins_play to load vars for managed-node2 46400 1727204643.58986: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204643.58989: Calling groups_plugins_play to load vars for managed-node2 46400 1727204643.60275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204643.61911: done with get_vars() 46400 1727204643.61939: done getting variables 46400 1727204643.61993: in VariableManager get_vars() 46400 1727204643.62008: Calling all_inventory to load vars for managed-node2 46400 1727204643.62010: Calling groups_inventory to load vars for managed-node2 46400 1727204643.62013: Calling all_plugins_inventory to load vars for managed-node2 46400 1727204643.62017: Calling all_plugins_play to load vars for managed-node2 46400 1727204643.62025: Calling groups_plugins_inventory to load vars for managed-node2 46400 1727204643.62028: Calling groups_plugins_play to load vars for managed-node2 46400 1727204643.63244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 46400 1727204643.64922: done with get_vars() 46400 1727204643.64959: done queuing things up, now waiting for results queue to drain 46400 1727204643.64961: results queue empty 46400 1727204643.64962: checking for any_errors_fatal 46400 1727204643.64965: done checking for any_errors_fatal 46400 1727204643.64966: checking for max_fail_percentage 46400 1727204643.64967: done checking for max_fail_percentage 46400 1727204643.64968: checking to see if all hosts have failed and the running result is not ok 46400 1727204643.64968: done checking to see if all hosts have failed 46400 1727204643.64969: getting the remaining hosts for this loop 46400 1727204643.64970: done getting the remaining hosts for this loop 46400 1727204643.64973: getting the next task for host managed-node2 46400 1727204643.64976: done getting next task for host managed-node2 46400 1727204643.64977: ^ task is: None 46400 1727204643.64979: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 46400 1727204643.64980: done queuing things up, now waiting for results queue to drain 46400 1727204643.64981: results queue empty 46400 1727204643.64982: checking for any_errors_fatal 46400 1727204643.64982: done checking for any_errors_fatal 46400 1727204643.64983: checking for max_fail_percentage 46400 1727204643.64984: done checking for max_fail_percentage 46400 1727204643.64985: checking to see if all hosts have failed and the running result is not ok 46400 1727204643.64985: done checking to see if all hosts have failed 46400 1727204643.64988: getting the next task for host managed-node2 46400 1727204643.64991: done getting next task for host managed-node2 46400 1727204643.64992: ^ task is: None 46400 1727204643.64993: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2 : ok=333 changed=10 unreachable=0 failed=0 skipped=313 rescued=0 ignored=11

Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.682) 0:02:13.935 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.68s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.82s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.79s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.69s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.68s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.64s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.61s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.61s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.60s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.57s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.56s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.56s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.54s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.27s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.19s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.18s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
46400 1727204643.65331: RUNNING CLEANUP
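Editor's note: the per-task timing table above is the style of summary printed by a profiling callback such as ansible.posix.profile_tasks. Below is a minimal sketch of re-running a play with such a callback explicitly enabled; the callback name and environment variable are standard Ansible configuration and are not taken from this log, the playbook path is assumed to be the tests_states_nm.yml file referenced in the timing table, and <inventory> is a placeholder.

    # Enable the task-profiling callback for a single run via the environment.
    ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks \
        ansible-playbook -vv -i <inventory> \
        /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml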